Vectorizing Images with LLMs
In this video, we walk through how to build a complete end-to-end image vectorization and similarity search system using modern AI tools, multimodal LLMs, and GPU-powered infrastructure.
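The core of a similarity search system like the one in the video is nearest-neighbor lookup over embedding vectors. Here is a minimal sketch in Python: the four-dimensional vectors below are toy stand-ins for what a multimodal embedding model would actually produce, and the image names are made up for illustration.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of vector norms.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query_vec, index):
    # index maps image name -> embedding vector; return the closest match.
    return max(index, key=lambda name: cosine_similarity(query_vec, index[name]))

# Toy 4-dimensional "embeddings" standing in for real model output.
index = {
    "cat.jpg": np.array([0.9, 0.1, 0.0, 0.2]),
    "dog.jpg": np.array([0.1, 0.8, 0.3, 0.0]),
    "car.jpg": np.array([0.0, 0.1, 0.9, 0.4]),
}
query = np.array([0.85, 0.15, 0.05, 0.1])  # a query vector that "looks like a cat"
print(most_similar(query, index))  # -> cat.jpg
```

In production you would replace the brute-force `max` scan with an approximate nearest-neighbor index, but the scoring logic stays the same.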
In this hands-on walkthrough, I’ll show you how to set up Open WebUI with OpenAI’s GPT-OSS-20B — a 20-billion-parameter open model — running locally inside Docker.
Learn how to use Azure OpenAI in a Fabric Jupyter notebook to summarize user review data, securely store secrets in Azure Key Vault, and work with data in Fabric Lakehouse Delta tables.
Last week Google shipped the first version of its Gemini large language model and made APIs available to developers. In this post we'll build a fully functional Gemini Chatbot using Streamlit!
This video is an end-to-end walkthrough of using Microsoft Copilot Studio to create a custom Microsoft Teams copilot that answers questions about organizational data. It starts from scratch and ends with a finished copilot published to a Teams channel.
This is a quick demo of how I use Bing Image Creator and Photoshop Generative AI together to create AI-generated images.
This post discusses Azure AI Search integrated vector embeddings for Retrieval Augmented Generation (RAG) when using a large language model for generative AI Q&A solutions, and provides steps to build a vector index using Azure AI Search.
This post discusses how to create vector embeddings for documents that are too large to fit into a single embedding vector by chunking data using text splitting. This technique is commonly used when designing Generative AI solutions using Large Language Models.
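The chunking idea in that post can be sketched with a simple character-based splitter. This is an illustrative version only: production splitters typically count tokens rather than characters and try to break on sentence or paragraph boundaries, but the overlapping-window mechanics are the same.

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into overlapping chunks so each fits within an embedding limit.

    Sizes are in characters here for simplicity; the overlap preserves context
    that would otherwise be lost at chunk boundaries.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    step = chunk_size - overlap  # advance by less than chunk_size to overlap
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks

doc = "word " * 100  # a stand-in for a long document (500 characters)
pieces = chunk_text(doc, chunk_size=120, overlap=20)
print(len(pieces), max(len(p) for p in pieces))  # -> 5 120
```

Each chunk is then embedded separately, so no single embedding has to represent the entire document.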
This post demonstrates how to use Retrieval Augmented Generation (RAG) to query OpenAI's Large Language Model (LLM) using a Vector database. This technique leverages the language understanding and summarization capabilities of Generative AI while introducing semantic understanding of our own data.
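The retrieval half of a RAG pipeline can be sketched in a few lines: rank stored passages by similarity to the query embedding, then prepend the top hits to the LLM prompt. The two-dimensional embeddings and sample passages below are fabricated for illustration; a real system would embed both with the same embedding model and send the assembled prompt to the LLM.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, store, k=2):
    # store: list of (text, embedding) pairs, as returned by a vector database.
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, passages):
    # Ground the model by supplying retrieved passages as context.
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Toy embeddings standing in for an embedding model's output.
store = [
    ("Our return policy allows refunds within 30 days.", np.array([1.0, 0.0])),
    ("Shipping takes 3-5 business days.", np.array([0.0, 1.0])),
]
query_vec = np.array([0.9, 0.1])  # embedding of the question below
prompt = build_prompt("How long do refunds take?", retrieve(query_vec, store, k=1))
print(prompt)
```

The assembled prompt is what gets sent to the LLM, which answers from the retrieved context rather than from its training data alone.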