LLM Application Development Lifecycle workshop

Workshop Overview

This workshop provides hands-on experience with the end-to-end lifecycle of Large Language Model (LLM) applications on the Azure AI platform. It covers the LLM application development lifecycle shown in the diagram below.

Lab Scenarios

This workshop provides hands-on experience with the following scenarios:

  1. Apply prompt engineering techniques.
  2. Create a hybrid search solution using Azure AI Search that includes vector search (a minimal query sketch follows this list).
  3. Create a RAG (Retrieval Augmented Generation) application using PromptFlow in Azure AI Studio.
  4. Test & evaluate the RAG application using PromptFlow in Azure AI Studio.
  5. Deploy the RAG application and consume it using a REST API.
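
For orientation, here is a minimal sketch of the hybrid (keyword + vector) query pattern that Lab 2 builds up to. It is not code from this repo: the index name `workshop-index`, the fields `title` and `contentVector`, the embeddings deployment name `text-embedding-ada-002`, and the environment variable names are all placeholder assumptions.

```python
# Hedged sketch of a hybrid (keyword + vector) query against Azure AI Search.
# Index name, field names, deployment name, and environment variables are assumptions.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

question = "Which health plans cover eye exams?"

# Embed the question with an Azure OpenAI embeddings deployment (name is a placeholder).
aoai = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
embedding = aoai.embeddings.create(
    model="text-embedding-ada-002",
    input=[question],
).data[0].embedding

# Hybrid query: keyword search plus a k-NN vector query over the same index.
search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="workshop-index",
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
results = search_client.search(
    search_text=question,  # keyword part of the hybrid query
    vector_queries=[
        VectorizedQuery(vector=embedding, k_nearest_neighbors=3, fields="contentVector")
    ],
    top=3,
)
for doc in results:
    # "title" is an assumed field on the index.
    print(doc["title"], doc["@search.score"])
```

The lab itself walks through creating the index and its fields; this snippet only illustrates the shape of the query.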

The diagram below depicts the Azure AI platform components used in the workshop scenarios.

Hands-on Lab Implementation Instructions

Lab instructions and resources are provided in the respective folders in this repo.

[RECOMMENDED SEQUENCE] Complete the labs in the order below, following the instructions in each folder.

  1. Lab 0: Azure Environment Setup for the workshop
  2. Lab 1: Prompt Engineering Techniques
  3. Lab 2: Create a hybrid search solution using Azure AI Search
  4. Lab 3: Create a RAG application using PromptFlow in Azure AI Studio
  5. Lab 4: Test & evaluate RAG application using PromptFlow in Azure AI Studio
  6. Lab 5: Deploy the RAG application and consume it using a REST API (a minimal client sketch follows this list)
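
As a preview of Lab 5, the sketch below shows one common way to call a deployed PromptFlow managed online endpoint over REST. The endpoint URL, key, and the `question` field name are assumptions based on typical PromptFlow deployments; use the values shown on your own deployment's Consume tab.

```python
# Hedged sketch of consuming a deployed PromptFlow endpoint via REST.
# Environment variable names and the request/response schema are assumptions.
import os

import requests

scoring_url = os.environ["PROMPTFLOW_ENDPOINT_URL"]  # e.g. https://<endpoint>.<region>.inference.ml.azure.com/score
api_key = os.environ["PROMPTFLOW_ENDPOINT_KEY"]

payload = {"question": "What is covered under the basic health plan?"}
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",
}

response = requests.post(scoring_url, json=payload, headers=headers, timeout=60)
response.raise_for_status()
print(response.json())
```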

Workshop Prerequisites

  1. Azure subscription.
  2. Access to the Azure OpenAI service.
  3. Azure AI Search service.
  4. Azure Storage account (a minimal configuration check sketch follows this list).
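
The check below is a hypothetical convenience, not part of the workshop materials: it verifies that settings for the prerequisite services are present before you start the labs. The environment variable names are assumptions; the labs may use different names.

```python
# Hypothetical pre-flight check for the prerequisite Azure services.
# The environment variable names below are assumptions, not defined by this workshop.
import os

REQUIRED_VARS = [
    "AZURE_OPENAI_ENDPOINT",            # Azure OpenAI service
    "AZURE_OPENAI_API_KEY",
    "AZURE_SEARCH_ENDPOINT",            # Azure AI Search service
    "AZURE_SEARCH_KEY",
    "AZURE_STORAGE_CONNECTION_STRING",  # Azure Storage account
]

missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("All prerequisite settings found.")
```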

Additional Resources
