[Bug]: 'temperature' does not support 0.0 with this model (O1-mini) #4131
Comments
@neubig I thought you had fixed something related to this?
Hmm, I did think I fixed it in this: #4012. Let me see if I can repro.
BTW @AlexCuadron, for the time being could you try export LLM_TEMPERATURE=1 or add it to
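The env-var workaround above can be sketched as a shell snippet. LLM_TEMPERATURE is the variable named in the comment; the echo afterward is just illustrative, to confirm the value before launching OpenHands:

```shell
# Force the temperature to 1, the only value o1-mini accepts.
export LLM_TEMPERATURE=1

# Confirm the variable is set before launching OpenHands.
echo "$LLM_TEMPERATURE"
```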
That works @neubig, thanks!
Awesome! And let me re-open it, because we'd still like to fix this so it works without the workaround.
I don't think there is anything to fix in code; the workaround is actually the solution.
I see what you're saying, but that would mean that users who select
I agree that it ideally should work out of the box; I'm just wondering how best to deal with these "exceptions to the rule". I assume there are no plans to bring such LLM options into the UI in the near future, or at all?
I really like the idea of having all of these exceptions in a single place! I'm actually still a bit confused that litellm doesn't handle this, though. I think having this sort of configuration in the UI might be worth it if we could think of a way to present it elegantly.
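The "exceptions in a single place" idea could look something like the sketch below. This is a hypothetical design, not OpenHands or litellm code; the registry contents and function name are made up for illustration:

```python
# Hypothetical registry of per-model parameter overrides, kept in one place.
# o1 models only accept the default temperature of 1 (see the error in this issue).
PARAM_OVERRIDES = {
    "o1-mini": {"temperature": 1},
    "o1-preview": {"temperature": 1},
}


def apply_overrides(model: str, params: dict) -> dict:
    """Return a copy of params with any model-specific overrides applied."""
    fixed = dict(params)  # never mutate the caller's dict
    fixed.update(PARAM_OVERRIDES.get(model, {}))
    return fixed
```

With this in place, a request for o1-mini with temperature 0.0 would be silently corrected to the only supported value, while other models keep whatever the user configured.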
I was working on this problem for a while, and learned that the temperature setting is not available in the "o1-preview" or "o1-mini" models, since they are still in beta. As the previous comments said, you can set the temperature to 1 for now, or simply not add the temperature parameter; it works either way. These models also lack tool compatibility (e.g. Wikipedia or arXiv web scrapers). I am attaching a link that lists all the limitations of the model; hopefully they add these functionalities in the future. https://platform.openai.com/docs/guides/reasoning/quickstart Check this link and scroll down to the "Beta limitations" section.
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.
This issue was closed because it has been stalled for over 30 days with no activity. |
Is there an existing issue for the same bug?
Describe the bug
I tried running OpenHands with o1-mini as my default model, and the model keeps "encountering an error", as described by the UI. However, upon closer inspection of the terminal, this seems to be the problem:
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
Current OpenHands version
Installation and Configuration
Model and Agent
Operating System
MacOS
Reproduction Steps
Logs, Errors, Screenshots, and Additional Context