
[Bug]: LightsBot does not work when supplied an Azure OpenAI API key and running locally against Microsoft 365 Developer Program Teams instance #2246

Open
adc-cjewett opened this issue Dec 24, 2024 · 3 comments

adc-cjewett commented Dec 24, 2024

Language

C#

Version

latest

Description

I wanted to check out the most recent updates to the Microsoft Teams bot space with regard to AI, but I can't seem to get the simple Azure OpenAI examples to work. I can run the EchoBot successfully, but LightBot and ChefBot both fail with the same exception.

I receive the following exception when sending the same light commands the sample suggests.

System.Exception: Operation returned an invalid status code 'BadRequest'
 ---> Microsoft.Teams.AI.Exceptions.TeamsAIException: Operation returned an invalid status code 'BadRequest'
   --- End of inner exception stack trace ---
   at Microsoft.Teams.AI.AI.Planners.ActionPlanner`1.ContinueTaskAsync(ITurnContext context, TState state, AI`1 ai, CancellationToken cancellationToken)
   at Microsoft.Teams.AI.AI.Planners.ActionPlanner`1.BeginTaskAsync(ITurnContext context, TState state, AI`1 ai, CancellationToken cancellationToken)
   at Microsoft.Teams.AI.AI.AI`1.RunAsync(ITurnContext turnContext, TState turnState, Nullable`1 startTime, Int32 stepCount, CancellationToken cancellationToken)
   at Microsoft.Teams.AI.Application`1._OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken)
   at Microsoft.Teams.AI.Application`1.OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken)
   at Microsoft.Bot.Builder.MiddlewareSet.ReceiveActivityWithStatusAsync(ITurnContext turnContext, BotCallbackHandler callback, CancellationToken cancellationToken)
   at Microsoft.Bot.Builder.BotAdapter.RunPipelineAsync(ITurnContext turnContext, BotCallbackHandler callback, CancellationToken cancellationToken)

I'm using an Azure OpenAI deployment that we already use for another bot, one created before the Microsoft.Teams.AI library developments, so we know it works in some capacity. It's possible I'm misconfiguring these sample applications somehow, though.

The Azure OpenAI key and endpoint I am using are from here:
(screenshot of the Azure OpenAI resource's key and endpoint)

I've tried both the bare resource domain and the full URL with the extra paths and query parameters; both fail. Any thoughts on what could be going wrong?
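
For reference, a minimal sketch of the Azure settings the sample appears to expect in appsettings.Development.json, assuming it binds Azure:OpenAIApiKey and Azure:OpenAIEndpoint and only wants the base resource endpoint (no /openai/deployments/... path or api-version query string); the values below are placeholders:

```json
{
  "Azure": {
    "OpenAIApiKey": "<azure-openai-api-key>",
    "OpenAIEndpoint": "https://<resource-name>.openai.azure.com"
  }
}
```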

Reproduction Steps

1. Open LightsBot.sln
2. Follow the instructions in the QuickStart guide to set up the Dev Tunnel and Prepare App Dependencies. https://github.com/microsoft/teams-ai/blob/main/getting-started/QUICKSTART.md#build-and-run-the-sample-app
3. Update appsettings.Development.json, Program.cs, and Prompts/tools/config.json with the correct Azure OpenAI API key, Azure OpenAI endpoint, and model (a sketch of the Program.cs registration follows these steps).
4. Build and run LightBot.csproj in Debug mode from Visual Studio.
5. A browser will open, navigate to Teams, and prompt to install the bot (or prompt to open it if it is already installed).
6. Open the chat with the Bot.
7. Send a `Turn lights on` message.
8. Receive the exception above, which isn't very helpful in determining what exactly went wrong. I suspect Azure OpenAI is the problem because the EchoBot sample works fine and I'm able to install all of the different sample applications.
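
As mentioned in step 3, here is a rough sketch of the model registration in Program.cs as I understand it, assuming the sample's OpenAIModel / AzureOpenAIModelOptions pattern (builder and config are the existing host builder and bound configuration object from the sample's Program.cs); the deployment name is a placeholder and must match a deployment that actually exists on the Azure OpenAI resource:

```csharp
using Microsoft.Teams.AI.AI.Models;

// Sketch only: register the Azure OpenAI model used by the planner.
// "my-gpt-4o-deployment" is a placeholder; it must be the Azure *deployment* name
// on the resource behind OpenAIEndpoint, not just the base model family.
builder.Services.AddSingleton<OpenAIModel>(sp => new(
    new AzureOpenAIModelOptions(
        config.Azure.OpenAIApiKey,
        "my-gpt-4o-deployment",
        config.Azure.OpenAIEndpoint   // e.g. https://<resource-name>.openai.azure.com
    )
    {
        LogRequests = true
    },
    sp.GetService<ILoggerFactory>()
));
```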
adc-cjewett added the bug label on Dec 24, 2024
Nivedipa-MSFT commented

@adc-cjewett - Thank you for bringing this issue to our attention. We will look into it and get back to you shortly.

SubbaReddi self-assigned this on Dec 24, 2024
SubbaReddi (Contributor) commented

@adc-cjewett: I see it leads to a BadRequest error with the gpt-4 model. Can you add a gpt-4o deployment, use the same model name in the Program.cs file, and validate?
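
For context, the model name also appears in the prompt configuration; a rough sketch of the completion settings in Prompts/tools/config.json, assuming the sample's prompt schema, with illustrative values that should line up with the deployment name used in Program.cs:

```json
{
  "schema": 1.1,
  "type": "completion",
  "completion": {
    "completion_type": "chat",
    "model": "my-gpt-4o-deployment",
    "max_tokens": 1000,
    "temperature": 0.2
  }
}
```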

adc-cjewett (Author) commented

@SubbaReddi It looks like changing the model to gpt-4o did work. Thank you!

We have both gpt-4 and gpt-4o models. Most of our applications are currently running gpt-4 as we prepare to move to gpt-4o. Are these samples only compatible with gpt-4o and later?
