
[Bug]: 'temperature' does not support 0.0 with this model (O1-mini) #4131

Closed
2 tasks done
AlexCuadron opened this issue Oct 1, 2024 · 12 comments
Labels: bug (Something isn't working) · severity:medium (Affecting multiple users) · Stale (Inactive for 30 days)

Comments

@AlexCuadron (Contributor)

Is there an existing issue for the same bug?

Describe the bug

I tried running OpenHands with o1-mini as my default model, but the model keeps "encountering an error", as reported by the UI. Upon closer inspection of the terminal, this appears to be the problem:
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}

Current OpenHands version

I am on the main branch, at commit: b4626ab93ebac96d87d239432f49f8af44cd0744

Installation and Configuration

git clone https://github.com/AlexCuadron/OpenHands.git
cd OpenHands
git remote add upstream [email protected]:All-Hands-AI/OpenHands.git
git remote -v
git fetch upstream
git checkout main
git merge upstream/main
git push origin main
conda create --name open-hands python=3.11 conda-forge::nodejs conda-forge::poetry
conda activate open-hands
brew install netcat
pip install git+https://github.com/OpenDevin/SWE-bench.git@7b0c4b1c249ed4b4600a5bba8afb916d543e034a
make build
poetry self update
make build
make setup-config
Setting up config.toml...
Enter your workspace directory (as absolute path) [default: ./workspace]:
Enter your LLM model name, used for running without UI. Set the model in the UI after you start the app. (see https://docs.litellm.ai/docs/providers for full list) [default: gpt-4o]: o1-mini
Enter your LLM api key:
Enter your LLM base URL [mostly used for local LLMs, leave blank if not needed - example: http://localhost:5001/v1/]:
Enter your LLM Embedding Model
Choices are:
  - openai
  - azureopenai
  - Embeddings available only with OllamaEmbedding:
    - llama2
    - mxbai-embed-large
    - nomic-embed-text
    - all-minilm
    - stable-code
    - bge-m3
    - bge-large
    - paraphrase-multilingual
    - snowflake-arctic-embed
  - Leave blank to default to 'BAAI/bge-small-en-v1.5' via huggingface
>
Config.toml setup completed.

export DEBUG=1

Model and Agent

  • o1-mini

Operating System

macOS

Reproduction Steps

  1. Open OpenHands
  2. Ask the LLM "what model are you?"
  3. The agent will encounter an error.

Logs, Errors, Screenshots, and Additional Context

==============
CodeActAgent LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0

17:36:41 - openhands:DEBUG: logger.py:238 - Logging to /Users/acuadron/Documents/OpenDevin/OpenHands/logs/llm/24-09-30_17-30/prompt_002.log

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

Traceback (most recent call last):
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 778, in completion
    raise e
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 714, in completion
    self.make_sync_openai_chat_completion_request(
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 573, in make_sync_openai_chat_completion_request
    raise e
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 562, in make_sync_openai_chat_completion_request
    raw_response = openai_client.chat.completions.with_raw_response.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/_legacy_response.py", line 353, in wrapped
    return cast(LegacyAPIResponse[R], func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/_utils/_utils.py", line 274, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 704, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1270, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 947, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/openai/_base_client.py", line 1051, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/main.py", line 1470, in completion
    raise e
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/main.py", line 1423, in completion
    response = openai_o1_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/chat/o1_handler.py", line 58, in completion
    response = super().completion(
               ^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 788, in completion
    raise OpenAIError(
litellm.llms.OpenAI.openai.OpenAIError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/acuadron/Documents/OpenDevin/OpenHands/openhands/controller/agent_controller.py", line 155, in start_step_loop
    await self._step()
  File "/Users/acuadron/Documents/OpenDevin/OpenHands/openhands/controller/agent_controller.py", line 422, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Documents/OpenDevin/OpenHands/agenthub/codeact_agent/codeact_agent.py", line 207, in step
    response = self.llm.completion(**params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 398, in <lambda>
    self._add_action_func(lambda rs: rs.outcome.result())
                                     ^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/miniconda3/envs/open-hands/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/miniconda3/envs/open-hands/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Documents/OpenDevin/OpenHands/openhands/llm/llm.py", line 187, in wrapper
    resp: ModelResponse = completion_unwrapped(*args, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/utils.py", line 1058, in wrapper
    raise e
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/utils.py", line 946, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/main.py", line 2900, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2033, in exception_type
    raise e
  File "/Users/acuadron/Library/Caches/pypoetry/virtualenvs/openhands-ai-RniBYVEh-py3.11/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 297, in exception_type
    raise BadRequestError(
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
17:36:42 - openhands:ERROR: agent_controller.py:161 - Error while running the agent: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.0 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
17:36:42 - openhands:ERROR: agent_controller.py:162 - Traceback (most recent call last):
  [identical traceback to the one above]
@mamoodi (Collaborator)

mamoodi commented Oct 1, 2024

@neubig I thought you had fixed something related to this?

@neubig (Contributor)

neubig commented Oct 1, 2024

Hmm, I did think I fixed it in this: #4012

Let me see if I can repro.

@neubig (Contributor)

neubig commented Oct 1, 2024

BTW @AlexCuadron , for the time being could you try

export LLM_TEMPERATURE=1

or add it to config.toml in the top OpenHands directory (you can see an example in config.template.toml).
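For reference, the env-var workaround above could be read along these lines (a minimal sketch; `resolve_temperature` and the 0.0 default are illustrative, not OpenHands' actual code):

```python
import os

# Hypothetical sketch of how an LLM_TEMPERATURE environment variable
# could override a default temperature of 0.0 (the value o1 rejects).
DEFAULT_TEMPERATURE = 0.0

def resolve_temperature() -> float:
    """Return the temperature, preferring the LLM_TEMPERATURE env var."""
    raw = os.environ.get("LLM_TEMPERATURE")
    return float(raw) if raw is not None else DEFAULT_TEMPERATURE

os.environ["LLM_TEMPERATURE"] = "1"
print(resolve_temperature())  # 1.0
```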

@AlexCuadron (Contributor, Author)

That works @neubig thx!

@neubig (Contributor)

neubig commented Oct 1, 2024

Awesome! And let me re-open it, because we'd still like to fix this so it works without the workaround.

@neubig neubig reopened this Oct 1, 2024
@tobitege (Collaborator)

tobitege commented Oct 1, 2024

I don't think there is anything to fix in code; the workaround is actually the solution.
The #4012 PR takes care of unsupported params, but temperature is supported for other OpenAI models, just not o1.
Thus, imo, setting the temperature at the model level is the correct way to go:

[llm.gpt-o1-mini]
temperature=1

@neubig (Contributor)

neubig commented Oct 2, 2024

I see what you're saying, but that would mean that users who select o1-mini or o1-preview through the UI would have to first parse and understand the error message, find documentation about how to fix the problem (e.g. this issue), and then create a config.toml file with that setting. I think that UX-wise it'd be preferable for it to just work, so if they select o1-mini it will automatically default to temperature settings that work with o1 (possibly while throwing a warning to the user so they might notice that this happened).
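The "just work with a warning" behavior described above could look roughly like this (a hypothetical helper with illustrative names, not actual project code):

```python
import warnings

def normalize_completion_params(model: str, params: dict) -> dict:
    """Force a supported temperature for o1-family models.

    Sketch only: o1 models accept only the default temperature of 1,
    so any other value is replaced and a warning is emitted instead of
    letting the request fail with a 400.
    """
    params = dict(params)  # avoid mutating the caller's dict
    if model.startswith("o1") and params.get("temperature") not in (None, 1):
        warnings.warn(
            f"{model} only supports temperature=1; overriding "
            f"temperature={params['temperature']}"
        )
        params["temperature"] = 1
    return params
```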

@tobitege (Collaborator)

tobitege commented Oct 2, 2024

I agree that it ideally should work out of the box, I'm just wondering how to best deal with these "exceptions to the rule".
Maybe we can have a separate, specific py file containing these special cases for arg/param changes, outside of the actual LLM class, so it's easy to find/maintain.

I'd assume that there are no plans to bring such LLM options into the UI in the near future or at all?
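The dedicated "special cases" file suggested above could be sketched as a small registry module that the LLM class consults before sending a request (names are illustrative, not actual OpenHands code):

```python
# Hypothetical registry of per-model parameter quirks, kept in one
# module so exceptions to the defaults are easy to find and maintain.
MODEL_PARAM_OVERRIDES: dict[str, dict] = {
    "o1-mini": {"temperature": 1},
    "o1-preview": {"temperature": 1},
}

def apply_overrides(model: str, params: dict) -> dict:
    """Merge any registered overrides for `model` into `params`."""
    merged = dict(params)
    merged.update(MODEL_PARAM_OVERRIDES.get(model, {}))
    return merged
```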

@neubig (Contributor)

neubig commented Oct 2, 2024

I really like the idea of having all of these exceptions in a single place! I'm actually still a bit confused that litellm doesn't handle this though...

I think that having this sort of configuration in the UI might be worth it if we could think of a way to present it elegantly.

@mamoodi mamoodi added the severity:medium Affecting multiple users label Oct 2, 2024
@Aatif123-hub

I was working on this problem for a while and learned that the temperature setting is not available in the "o1-preview" or "o1-mini" models, since they are still in beta. As in the previous comments, you can set the temperature to 1 for now, or simply omit the temperature parameter; it works either way. The models also do not support tools (e.g., Wikipedia or arXiv web scrapers).

I am attaching a link that lists all the limitations of the model. Hopefully they add these capabilities in the future.

https://platform.openai.com/docs/guides/reasoning/quickstart

Check this link and scroll down to "Beta limitations" to see all of them.

@github-actions (bot)

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Nov 17, 2024
@github-actions (bot)

This issue was closed because it has been stalled for over 30 days with no activity.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Nov 24, 2024