From 7d1a2b5a13e016e1dbb0f58b107f0c8c6e102b31 Mon Sep 17 00:00:00 2001
From: Hiroshi Yoshioka <40815708+hyoshioka0128@users.noreply.github.com>
Date: Wed, 22 May 2024 00:45:33 +0900
Subject: [PATCH] =?UTF-8?q?Update=20README.md=20(Typo=20"Azure=20Open=20AI?=
 =?UTF-8?q?"=E2=86=92"Azure=20OpenAI")?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

https://github.com/Azure-Samples/rag-data-openai-python-promptflow/blob/main/README.md
#PingMSFTDocs
---
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index df270dd..d286a94 100644
--- a/README.md
+++ b/README.md
@@ -110,7 +110,7 @@ This sample repository contains a sample chat prompty file you can explore. This
 
 This pattern was covered in the [hello world prompting sample](https://github.com/Azure-Samples/ai-studio-hello-world), showing how the Prompty file format let's you streamline your LLM calls.
 
-You can test your connection to your Azure Open AI model by running only the sample prompt. Try changing up the specified system prompt to see how the model behaves with additional prompting.
+You can test your connection to your Azure OpenAI model by running only the sample prompt. Try changing up the specified system prompt to see how the model behaves with additional prompting.
 
 ``` bash
 cd ..
@@ -152,8 +152,8 @@ The code follows the following general logic:
 
 1. Generates a search query based on user query intent and any chat history
 1. Uses an embedding model to embed the query
 1. Retrieves relevant documents from the search index, given the query
-1. Passes the relevant context to the Azure Open AI chat completion model
-1. Returns the response from the Azure Open AI model
+1. Passes the relevant context to the Azure OpenAI chat completion model
+1. Returns the response from the Azure OpenAI model
 
 You can modify this logic as appropriate to fit your use case.