- Copy `.env.sample` to `.env` and replace the Azure OpenAI key and search key placeholders with your own values.
- Create the Python virtual environment by running `./scripts/loadenv.sh` (bash) or `./scripts/loadenv.ps1` (PowerShell).
- Build the frontend in `./frontend` with `npm run watch`. This builds the frontend and writes the bundled assets to `./static` whenever the source changes.
- Start the Flask app that serves the frontend:
  - Activate the Python venv with `source ./.venv/bin/activate` (bash) or `./.venv/Scripts/Activate.ps1` (PowerShell).
  - Run the Flask app: `python -m flask run --port=5000 --host=127.0.0.1 --reload --debug`
- Go to http://localhost:5000 in your browser.
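Putting these steps together, a typical local dev session looks roughly like the following sketch (bash on Linux/macOS; use the `.ps1` equivalents on Windows). It assumes Node dependencies are installed with `npm install` first:

```bash
# One-time setup
cp .env.sample .env                 # then fill in your Azure OpenAI and Search keys
./scripts/loadenv.sh                # create the Python virtual environment

# Terminal 1: rebuild the frontend into ./static on every source change
cd frontend && npm install && npm run watch

# Terminal 2: run the Flask backend that serves ./static
source ./.venv/bin/activate
python -m flask run --port=5000 --host=127.0.0.1 --reload --debug
# then browse to http://localhost:5000
```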
This repo contains sample code for a simple chat webapp that integrates with Azure OpenAI. Note: some portions of the app use preview APIs.
- An existing Azure OpenAI resource and a model deployment of a chat model (e.g. `gpt-35-turbo-16k`, `gpt-4`).
- To use Azure OpenAI on your data: an existing Azure Cognitive Search resource and index.
Please see README_azd.md for detailed instructions.
Click on the Deploy to Azure button and configure your settings in the Azure Portal as described in the Environment variables section.
Please see the section below for important information about adding authentication to your app.
- Update the environment variables listed in `app.py` as described in the Environment variables section. These variables are required:

  - `AZURE_OPENAI_RESOURCE`
  - `AZURE_OPENAI_MODEL`
  - `AZURE_OPENAI_KEY`

  These variables are optional:

  - `AZURE_OPENAI_TEMPERATURE`
  - `AZURE_OPENAI_TOP_P`
  - `AZURE_OPENAI_MAX_TOKENS`
  - `AZURE_OPENAI_STOP_SEQUENCE`
  - `AZURE_OPENAI_SYSTEM_MESSAGE`

  See the documentation for more information on these parameters. A sample `.env` sketch for local development follows this list.
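For local development these can be set in your `.env` file (copied from `.env.sample` above). A minimal sketch with placeholder values, using the variable names listed above (confirm against `.env.sample` for the exact names used locally):

```
# Required Azure OpenAI settings (placeholders -- replace with your own values)
AZURE_OPENAI_RESOURCE=<your-aoai-resource-name>
AZURE_OPENAI_MODEL=<your-chat-model-deployment-name>
AZURE_OPENAI_KEY=<your-aoai-api-key>

# Optional tuning settings (defaults from the Environment variables table below)
AZURE_OPENAI_TEMPERATURE=0
AZURE_OPENAI_TOP_P=1.0
AZURE_OPENAI_MAX_TOKENS=1000
AZURE_OPENAI_SYSTEM_MESSAGE="You are an AI assistant that helps people find information."
```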
- Start the app with `start.cmd`. This will build the frontend, install backend dependencies, and then start the app.
- You can see the local running app at http://127.0.0.1:5000.
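If you are on Linux or macOS, the repo's `start.sh` (referenced in the deployment notes below) plays the same role as `start.cmd`:

```bash
# From the repo root
start.cmd     # Windows
./start.sh    # Linux/macOS
```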
More information about Azure OpenAI on your data
- Update the `AZURE_OPENAI_*` environment variables as described above.
- To connect to your data, you need to specify an Azure Cognitive Search index to use. You can create this index yourself or use Azure AI Studio to create the index for you.

  These variables are required when adding your data:

  - `AZURE_SEARCH_SERVICE`
  - `AZURE_SEARCH_INDEX`
  - `AZURE_SEARCH_KEY`

  These variables are optional:

  - `AZURE_SEARCH_USE_SEMANTIC_SEARCH`
  - `AZURE_SEARCH_SEMANTIC_SEARCH_CONFIG`
  - `AZURE_SEARCH_INDEX_TOP_K`
  - `AZURE_SEARCH_ENABLE_IN_DOMAIN`
  - `AZURE_SEARCH_CONTENT_COLUMNS`
  - `AZURE_SEARCH_FILENAME_COLUMN`
  - `AZURE_SEARCH_TITLE_COLUMN`
  - `AZURE_SEARCH_URL_COLUMN`
  - `AZURE_SEARCH_VECTOR_COLUMNS`
  - `AZURE_SEARCH_QUERY_TYPE`
  - `AZURE_SEARCH_PERMITTED_GROUPS_COLUMN`
  - `AZURE_SEARCH_STRICTNESS`
  - `AZURE_OPENAI_EMBEDDING_ENDPOINT`
  - `AZURE_OPENAI_EMBEDDING_KEY`

  A minimal `.env` sketch for these settings follows this list.
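A minimal `.env` sketch for the data-connection settings (placeholder values shown; see the Environment variables table below for the full list and defaults):

```
# Required when connecting to your data (placeholders -- replace with your own values)
AZURE_SEARCH_SERVICE=<your-search-service-name>
AZURE_SEARCH_INDEX=<your-search-index-name>
AZURE_SEARCH_KEY=<your-search-admin-key>

# Commonly used optional settings (defaults from the Environment variables table below)
AZURE_SEARCH_USE_SEMANTIC_SEARCH=False
AZURE_SEARCH_QUERY_TYPE=simple
AZURE_SEARCH_TOP_K=5
```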
- Start the app with `start.cmd`. This will build the frontend, install backend dependencies, and then start the app.
- You can see the local running app at http://127.0.0.1:5000.
To enable chat history, you will need to set up CosmosDB resources. The ARM template in the `infrastructure` folder can be used to deploy an app service and a CosmosDB account with the database and container configured. Then specify these additional environment variables:

- `AZURE_COSMOSDB_ACCOUNT`
- `AZURE_COSMOSDB_DATABASE`
- `AZURE_COSMOSDB_CONVERSATIONS_CONTAINER`
- `AZURE_COSMOSDB_ACCOUNT_KEY`
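A corresponding `.env` sketch (placeholder values; replace with your own account, database, and container names, and key):

```
# Chat history settings (placeholders -- replace with your own values)
AZURE_COSMOSDB_ACCOUNT=<your-cosmosdb-account-name>
AZURE_COSMOSDB_DATABASE=<your-cosmosdb-database-name>
AZURE_COSMOSDB_CONVERSATIONS_CONTAINER=<your-conversations-container-name>
AZURE_COSMOSDB_ACCOUNT_KEY=<your-cosmosdb-account-key>
```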
As above, start the app with `start.cmd`, then visit the local running app at http://127.0.0.1:5000.
NOTE: If you've made code changes, be sure to build the app code with `start.cmd` or `start.sh` before you deploy, otherwise your changes will not be picked up. If you've updated any files in the `frontend` folder, make sure you see updates to the files in the `static` folder before you deploy.
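If you only need to rebuild the frontend bundle before deploying (rather than running the full start script), something like the following should work, assuming the frontend package defines a `build` script alongside the `watch` script used above:

```bash
# Rebuild the frontend assets into ./static before deploying
cd frontend
npm install
npm run build    # assumed build script; `npm run watch` also writes to ./static
cd ..
```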
You can use the Azure CLI to deploy the app from your local machine. Make sure you have version 2.48.1 or later.
If this is your first time deploying the app, you can use `az webapp up`. Run the following command from the root folder of the repo, updating the placeholder values to your desired app name, resource group, location, and subscription. You can also change the SKU if desired.
```bash
az webapp up --runtime PYTHON:3.10 --sku B1 --name <new-app-name> --resource-group <resource-group-name> --location <azure-region> --subscription <subscription-name>
```
If you've deployed the app previously, first run this command to update the appsettings to allow local code deployment:
```bash
az webapp config appsettings set -g <resource-group-name> -n <existing-app-name> --settings WEBSITE_WEBDEPLOY_USE_SCM=false
```
Check the runtime stack for your app by viewing the app service resource in the Azure Portal. If it shows "Python - 3.10", use `PYTHON:3.10` in the runtime argument below. If it shows "Python - 3.11", use `PYTHON:3.11` in the runtime argument below.

Check the SKU in the same way. Use the abbreviated SKU name in the argument below, e.g. for "Basic (B1)" the SKU is `B1`.
Then, use the `az webapp up` command to deploy your local code to the existing app:

```bash
az webapp up --runtime <runtime-stack> --sku <sku> --name <existing-app-name> --resource-group <resource-group-name>
```
Make sure that the app name and resource group match exactly for the app that was previously deployed.
Deployment will take several minutes. When it completes, you should be able to navigate to your app at {app-name}.azurewebsites.net.
After deployment, you will need to add an identity provider to provide authentication support in your app. See this tutorial for more information.
If you don't add an identity provider, the chat functionality of your app will be blocked to prevent unauthorized access to your resources and data. To remove this restriction, or to add further access controls, update the logic in `getUserInfoList` in `frontend/src/pages/chat/Chat.tsx`. For example, disable the authorization check like so:
```typescript
const getUserInfoList = async () => {
  setShowAuthMessage(false);
}
```
Feel free to fork this repository and make your own modifications to the UX or backend logic. For example, you may want to change aspects of the chat display, or expose some of the settings in `app.py` in the UI for users to try out different behaviors.

The landing chat page logo and headers are specified in `frontend/src/pages/chat/Chat.tsx`:
```tsx
<Stack className={styles.chatEmptyState}>
  <img
    src={Azure}
    className={styles.chatIcon}
    aria-hidden="true"
  />
  <h1 className={styles.chatEmptyStateTitle}>Start chatting</h1>
  <h2 className={styles.chatEmptyStateSubtitle}>This chatbot is configured to answer your questions</h2>
</Stack>
```
To update the logo, change `src={Azure}` to point to your own SVG file, which you can put in `frontend/src/assets/`.

To update the headers, change the strings "Start chatting" and "This chatbot is configured to answer your questions" to your desired values.
The Citation panel is defined at the end of `frontend/src/pages/chat/Chat.tsx`. The citations returned from Azure OpenAI On Your Data will include `content`, `title`, `filepath`, and in some cases `url`. You can customize the Citation section to use and display these as you like. For example, the "View Source" button will open the citation URL in a new tab when clicked:
```tsx
const onViewSource = (citation: Citation) => {
  if (citation.url) {
    window.open(citation.url, "_blank");
  }
};
```

```tsx
<span
  title={activeCitation.url}
  tabIndex={0}
  role="link"
  onClick={() => onViewSource(activeCitation)}
  onKeyDown={e => e.key === "Enter" || e.key === " " ? onViewSource(activeCitation) : null}
  className={styles.viewSourceButton}
  aria-label={activeCitation.url}
>
  View Source
</span>
```
We recommend keeping these best practices in mind:
- Reset the chat session (clear chat) if the user changes any settings. Notify the user that their chat history will be lost.
- Clearly communicate to the user what impact each setting will have on their experience.
- When you rotate API keys for your Azure OpenAI or Azure Cognitive Search resource, be sure to update the app settings for each of your deployed apps to use the new key.
- Pull in changes from `main` frequently to ensure you have the latest bug fixes and improvements, especially when using Azure OpenAI on your data.
Note: settings starting with `AZURE_SEARCH` are only needed when using Azure OpenAI on your data. If not connecting to your data, you only need to specify the `AZURE_OPENAI` settings.
App Setting | Value | Note |
---|---|---|
AZURE_SEARCH_SERVICE | | The name of your Azure Cognitive Search resource |
AZURE_SEARCH_INDEX | | The name of your Azure Cognitive Search index |
AZURE_SEARCH_KEY | | An admin key for your Azure Cognitive Search resource |
AZURE_SEARCH_USE_SEMANTIC_SEARCH | False | Whether or not to use semantic search |
AZURE_SEARCH_QUERY_TYPE | simple | Query type: simple, semantic, vector, vectorSimpleHybrid, or vectorSemanticHybrid. Takes precedence over AZURE_SEARCH_USE_SEMANTIC_SEARCH |
AZURE_SEARCH_SEMANTIC_SEARCH_CONFIG | | The name of the semantic search configuration to use, if using semantic search |
AZURE_SEARCH_TOP_K | 5 | The number of documents to retrieve from Azure Cognitive Search |
AZURE_SEARCH_ENABLE_IN_DOMAIN | True | Limits responses to only queries relating to your data |
AZURE_SEARCH_CONTENT_COLUMNS | | List of fields in your Azure Cognitive Search index that contain the text content of your documents, used when formulating a bot response. Represent these as a string joined with "\|" |
AZURE_SEARCH_FILENAME_COLUMN | | Field from your Azure Cognitive Search index that gives a unique identifier of the source of your data, to display in the UI |
AZURE_SEARCH_TITLE_COLUMN | | Field from your Azure Cognitive Search index that gives a relevant title or header for your data content, to display in the UI |
AZURE_SEARCH_URL_COLUMN | | Field from your Azure Cognitive Search index that contains a URL for the document, e.g. an Azure Blob Storage URI. This value is not currently used |
AZURE_SEARCH_VECTOR_COLUMNS | | List of fields in your Azure Cognitive Search index that contain vector embeddings of your documents, used when formulating a bot response. Represent these as a string joined with "\|" |
AZURE_SEARCH_PERMITTED_GROUPS_COLUMN | | Field from your Azure Cognitive Search index that contains AAD group IDs determining document-level access control |
AZURE_SEARCH_STRICTNESS | 3 | Integer from 1 to 5 specifying the strictness for the model limiting responses to your data |
AZURE_OPENAI_RESOURCE | | The name of your Azure OpenAI resource |
AZURE_OPENAI_MODEL | | The name of your model deployment |
AZURE_OPENAI_ENDPOINT | | The endpoint of your Azure OpenAI resource |
AZURE_OPENAI_MODEL_NAME | gpt-35-turbo-16k | The name of the model |
AZURE_OPENAI_KEY | | One of the API keys of your Azure OpenAI resource |
AZURE_OPENAI_TEMPERATURE | 0 | What sampling temperature to use, between 0 and 2. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic. A value of 0 is recommended when using your data. |
AZURE_OPENAI_TOP_P | 1.0 | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. We recommend setting this to 1.0 when using your data. |
AZURE_OPENAI_MAX_TOKENS | 1000 | The maximum number of tokens allowed for the generated answer |
AZURE_OPENAI_STOP_SEQUENCE | | Up to 4 sequences where the API will stop generating further tokens. Represent these as a string joined with "\|" |
AZURE_OPENAI_SYSTEM_MESSAGE | You are an AI assistant that helps people find information. | A brief description of the role and tone the model should use |
AZURE_OPENAI_PREVIEW_API_VERSION | 2023-06-01-preview | API version to use when using Azure OpenAI on your data |
AZURE_OPENAI_STREAM | True | Whether or not to use streaming for the response |
AZURE_OPENAI_EMBEDDING_ENDPOINT | | The endpoint for your Ada embedding model deployment, if using vector search |
AZURE_OPENAI_EMBEDDING_KEY | | The key for the Azure OpenAI resource with the Ada deployment, used with vector search |
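For deployed apps, the same settings can be applied as App Service app settings with the Azure CLI, using the `az webapp config appsettings set` command shown earlier. A sketch with placeholder names and only a few of the settings above:

```bash
# Apply a subset of the settings above to an existing App Service app
az webapp config appsettings set \
  --resource-group <resource-group-name> \
  --name <existing-app-name> \
  --settings \
    AZURE_OPENAI_RESOURCE=<your-aoai-resource-name> \
    AZURE_OPENAI_MODEL=<your-chat-model-deployment-name> \
    AZURE_OPENAI_KEY=<your-aoai-api-key> \
    AZURE_OPENAI_SYSTEM_MESSAGE="You are an AI assistant that helps people find information."
```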
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.