From 428eef2b4efe7100013a155c2ea04a209e959489 Mon Sep 17 00:00:00 2001
From: "John Alexander (MSFT)" <174467815+ms-johnalex@users.noreply.github.com>
Date: Mon, 30 Sep 2024 18:04:54 -0500
Subject: [PATCH] Update README.md

Removed sentence about local model usage support
---
 README.md | 1 -
 1 file changed, 1 deletion(-)

diff --git a/README.md b/README.md
index 9c03d0e..d2513eb 100644
--- a/README.md
+++ b/README.md
@@ -28,7 +28,6 @@ since the local app needs credentials for Azure OpenAI to work properly.
 * A Python [Quart](https://quart.palletsprojects.com/en/latest/) that uses the [openai](https://pypi.org/project/openai/) package to generate responses to user messages.
 * A basic HTML/JS frontend that streams responses from the backend using [JSON Lines](http://jsonlines.org/) over a [ReadableStream](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream).
 * [Bicep files](https://docs.microsoft.com/azure/azure-resource-manager/bicep/) for provisioning Azure resources, including Azure OpenAI, Azure Container Apps, Azure Container Registry, Azure Log Analytics, and RBAC roles.
-* Support for using [local LLMs](/docs/local_ollama.md) during development.
 
 ![Screenshot of the chat app](docs/screenshot_chatapp.png)