This workshop provides hands-on experience with the end-to-end lifecycle of Large Language Model (LLM) applications on the Azure AI platform, covering the development lifecycle shown in the diagram below. You will work through the following scenarios:
- Apply prompt engineering techniques.
- Create a hybrid search solution, including vector search, using Azure AI Search.
- Build a RAG (Retrieval Augmented Generation) application using PromptFlow in Azure AI Studio.
- Test and evaluate the RAG application using PromptFlow in Azure AI Studio.
- Deploy the RAG application and consume it through a REST API.
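Azure AI Search merges the keyword (BM25) and vector rankings of a hybrid query using Reciprocal Rank Fusion (RRF). A minimal sketch of that fusion step, with made-up document ids standing in for real search results:

```python
# Reciprocal Rank Fusion (RRF): each list contributes 1/(k + rank) per document,
# so documents ranked well by BOTH keyword and vector search rise to the top.
def rrf_fuse(rankings, k=60):
    """Fuse several ranked lists of document ids into one ranking.

    rankings: list of lists, each ordered best-first.
    k: smoothing constant (Azure AI Search documents k=60).
    """
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

keyword_hits = ["doc3", "doc1", "doc7"]   # hypothetical BM25 order
vector_hits = ["doc1", "doc5", "doc3"]    # hypothetical vector-similarity order
print(rrf_fuse([keyword_hits, vector_hits]))  # doc1 wins: high in both lists
```

In the labs the fusion happens inside the service; this sketch only illustrates why a hybrid query can outrank either retrieval mode alone.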
The diagram below depicts the Azure AI platform components used in these scenarios.
Lab instructions and resources are provided in the respective folders in this repo. Complete the labs in the recommended sequence below.
- Lab 0: Azure Environment Setup for the workshop
- Lab 1: Prompt Engineering Techniques
- Lab 2: Create a hybrid search solution using Azure AI Search
- Lab 3: Create a RAG application using PromptFlow in Azure AI Studio
- Lab 4: Test & evaluate RAG application using PromptFlow in Azure AI Studio
- Lab 5: Deploy RAG application and consume it using REST API
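In Lab 5, the deployed flow is exposed as a scoring endpoint that you call over REST. A hedged sketch of building such a request, without sending it: the endpoint URL, key placeholder, and the `question` field name are assumptions — check the deployment's Consume tab in Azure AI Studio for the real URI and payload schema.

```python
import json
import urllib.request

def build_request(endpoint_url, api_key, question):
    """Build a POST request for a PromptFlow scoring endpoint (not sent here)."""
    body = json.dumps({"question": question}).encode("utf-8")  # field name is an assumption
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",  # key-based auth for the online endpoint
    }
    return urllib.request.Request(endpoint_url, data=body, headers=headers, method="POST")

# Placeholder URL and key — substitute the values from your own deployment.
req = build_request(
    "https://<endpoint>.<region>.inference.ml.azure.com/score",
    "<api-key>",
    "What is RAG?",
)
print(req.get_method(), json.loads(req.data)["question"])
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) returns the flow's JSON answer once the endpoint and key are real.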
The labs require the following Azure resources:
- An Azure subscription.
- Access to the Azure OpenAI Service.
- An Azure AI Search service.
- An Azure Storage account.