[Bug]: OpenRouter GPT-4o-mini model is not working #4920

Closed
1 task done
shaxxx opened this issue Nov 12, 2024 · 15 comments
Assignees: enyst
Labels: bug (Something isn't working) · severity:medium (Affecting multiple users) · Stale (Inactive for 30 days)

Comments


shaxxx commented Nov 12, 2024

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

When openrouter/openai/gpt-4o-mini is entered in the Advanced options Custom model field, OpenHands returns an error in the console and does not respond to the prompt. If I enter any other model, leaving all other settings unchanged, it works as expected (e.g. openrouter/openai/o1-mini). I can reproduce it with any prompt (e.g. "Say hello").
And yes, I've triple-checked the key; again, other models work without changing the key.
Also, why is there no 'openai/gpt-4o-mini' option in the OpenRouter model list? Its price and ranking (currently #1 in the "Programming/scripting" category) make it the default choice for most tasks.

OpenHands Installation

Docker command in README

OpenHands Version

0.12

Operating System

WSL on Windows

Logs, Errors, Screenshots, and Additional Context

2024-11-12 08:02:56 ==============
2024-11-12 08:02:56 [Agent Controller 8e811def-d626-466f-8dd1-ed1db210ab92] LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0
2024-11-12 08:02:56
2024-11-12 08:02:57
2024-11-12 08:02:57 Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
2024-11-12 08:02:57 LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-11-12 08:02:57
2024-11-12 08:02:57 Traceback (most recent call last):
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/OpenAI/openai.py", line 854, in completion
2024-11-12 08:02:57 raise e
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/OpenAI/openai.py", line 805, in completion
2024-11-12 08:02:57 return convert_to_model_response_object(
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/llm_response_utils/convert_dict_to_response.py", line 289, in convert_to_model_response_object
2024-11-12 08:02:57 raise raised_exception
2024-11-12 08:02:57 Exception
2024-11-12 08:02:57
2024-11-12 08:02:57 During handling of the above exception, another exception occurred:
2024-11-12 08:02:57
2024-11-12 08:02:57 Traceback (most recent call last):
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2168, in completion
2024-11-12 08:02:57 response = openai_chat_completions.completion(
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/OpenAI/openai.py", line 864, in completion
2024-11-12 08:02:57 raise OpenAIError(
2024-11-12 08:02:57 litellm.llms.OpenAI.openai.OpenAIError
2024-11-12 08:02:57
2024-11-12 08:02:57 During handling of the above exception, another exception occurred:
2024-11-12 08:02:57
2024-11-12 08:02:57 Traceback (most recent call last):
2024-11-12 08:02:57 File "/app/openhands/controller/agent_controller.py", line 168, in start_step_loop
2024-11-12 08:02:57 await self._step()
2024-11-12 08:02:57 File "/app/openhands/controller/agent_controller.py", line 464, in _step
2024-11-12 08:02:57 action = self.agent.step(self.state)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/openhands/agenthub/codeact_agent/codeact_agent.py", line 359, in step
2024-11-12 08:02:57 response = self.llm.completion(**params)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
2024-11-12 08:02:57 return copy(f, *args, **kw)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
2024-11-12 08:02:57 do = self.iter(retry_state=retry_state)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
2024-11-12 08:02:57 result = action(retry_state)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 398, in
2024-11-12 08:02:57 self._add_action_func(lambda rs: rs.outcome.result())
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
2024-11-12 08:02:57 return self.__get_result()
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
2024-11-12 08:02:57 raise self._exception
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
2024-11-12 08:02:57 result = fn(*args, **kwargs)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/openhands/llm/llm.py", line 196, in wrapper
2024-11-12 08:02:57 resp: ModelResponse = completion_unwrapped(*args, **kwargs)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1013, in wrapper
2024-11-12 08:02:57 raise e
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 903, in wrapper
2024-11-12 08:02:57 result = original_function(*args, **kwargs)
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2999, in completion
2024-11-12 08:02:57 raise exception_type(
2024-11-12 08:02:57 ^^^^^^^^^^^^^^^
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2116, in exception_type
2024-11-12 08:02:57 raise e
2024-11-12 08:02:57 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 1979, in exception_type
2024-11-12 08:02:57 raise BadRequestError(
2024-11-12 08:02:57 litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenrouterException -
2024-11-12 08:02:57 07:02:57 - openhands:ERROR: agent_controller.py:135 - [Agent Controller 8e811def-d626-466f-8dd1-ed1db210ab92] Error while running the agent: litellm.BadRequestError: OpenrouterException -

And here are my settings:
[screenshot: openhands1]


shaxxx commented Nov 12, 2024

Tried with the 0.13 version; the only difference is that the agent now informs me in the chat window that there was an error.
I even tried with alternative settings: other models work, gpt-4o-mini doesn't.
[screenshot: openhands2]
Cline works without any problems with the same model.

mamoodi added the severity:medium (Affecting multiple users) label on Nov 12, 2024

mamoodi commented Nov 12, 2024

Thanks very much for bringing this issue up. Can you try the main version as well and see if it works there?

docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:main-nikolaik \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    -e LOG_ALL_EVENTS=true \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:main

There was a change that added some support for other models and I'd like to see if it works here too.


shaxxx commented Nov 12, 2024

I had to modify the startup command to properly escape it for the bash shell in Docker, and I added a few extra parameters:

docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:main-nikolaik \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -e LOG_ALL_EVENTS=true \
    -e DEBUG=1 \
    -e SANDBOX_TIMEOUT=120 \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v //var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app-$(date +%Y%m%d%H%M%S) \
    docker.all-hands.dev/all-hands-ai/openhands:main

Here's the console log:
https://pastebin.com/raw/wiwCwfgn

Here's the OpenHands log:
https://pastebin.com/raw/j75riK9F

Still not working.


mamoodi commented Nov 12, 2024

Yeah, something is up with this model. I see this in the logs, though I'm not sure if it's what causes the error:

14:38:18 - openhands:DEBUG: action_execution_server.py:167 - Action output:
**ErrorObservation**
File not found: /workspace/.gitignore. Your current working directory is /workspace.

Let me see if I can get someone to take a look at this.


shaxxx commented Nov 12, 2024

It's not. I tried creating a valid .gitignore file; it still fails.
But setting the LiteLLM output to verbose gives the real error:

Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'

2024-11-12 17:30:39 RAW RESPONSE:
2024-11-12 17:30:39 {"id": null, "choices": null, "created": null, "model": null, "object": null, "service_tier": null, "system_fingerprint": null, "usage": null, "error": {"message": "Provider returned error", "code": 400, "metadata": {"raw": "{\n "error": {\n "message": "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.",\n "type": "invalid_request_error",\n "param": "messages.[3].role",\n "code": null\n }\n}", "provider_name": "OpenAI"}}}
2024-11-12 17:30:39
2024-11-12 17:30:39
2024-11-12 17:30:39 openai.py: Received openai error -
2024-11-12 17:30:39 RAW RESPONSE:
2024-11-12 17:30:39
2024-11-12 17:30:39
2024-11-12 17:30:39
2024-11-12 17:30:39
2024-11-12 17:30:39 Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
2024-11-12 17:30:39 LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
2024-11-12 17:30:39
2024-11-12 17:30:39 Traceback (most recent call last):
2024-11-12 17:30:39 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/OpenAI/openai.py", line 854, in completion
2024-11-12 17:30:39 raise e
2024-11-12 17:30:39 File "/app/.venv/lib/python3.12/site-packages/litellm/llms/OpenAI/openai.py", line 805, in completion
2024-11-12 17:30:39 return convert_to_model_response_object(
2024-11-12 17:30:39 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-11-12 17:30:39 File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/llm_response_utils/convert_dict_to_response.py", line 366, in convert_to_model_response_object
2024-11-12 17:30:39 raise raised_exception
2024-11-12 17:30:39 Exception
2024-11-12 17:30:39
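
For context, the provider error above concerns message ordering in the OpenAI chat-completions format that OpenRouter relays: a message with role "tool" is only accepted immediately after an assistant message whose tool_calls list contains a matching id. Below is a minimal sketch of the accepted versus rejected sequence, outside OpenHands; the tool name and call id are made up for illustration, and an OPENROUTER_API_KEY in the environment is assumed:

import litellm

litellm.set_verbose = True  # surfaces the RAW RESPONSE shown in the logs above

# Accepted: the "tool" message answers the assistant's tool_calls entry
# through a matching tool_call_id.
ok_messages = [
    {"role": "user", "content": "Say hello"},
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [{
            "id": "call_1",  # made-up id, for illustration only
            "type": "function",
            "function": {"name": "execute_bash", "arguments": '{"command": "echo hello"}'},
        }],
    },
    {"role": "tool", "tool_call_id": "call_1", "content": "hello"},
]

# Rejected: no preceding assistant message carries a tool_calls entry for
# this "tool" message to answer, so the provider returns the 400 above.
bad_messages = [
    {"role": "user", "content": "Say hello"},
    {"role": "tool", "tool_call_id": "call_1", "content": "hello"},
]

# Raises litellm.BadRequestError, reproducing the failure in this issue.
litellm.completion(model="openrouter/openai/gpt-4o-mini", messages=bad_messages)

The param field in the raw response, messages.[3].role, points at the same thing: the fourth message in the conversation OpenHands built was a tool message the provider could not pair with a tool call.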


mamoodi commented Nov 12, 2024

Apologies for the ping @xingyaoww, but the user seems to have an issue with GPT-4o-mini specifically. Is it that the model doesn't support something?


enyst commented Nov 12, 2024

@shaxxx In your last screenshot, it shows o1-mini, not gpt-4o-mini. Is o1-mini the one that doesn't work for you?

With the current main, I don't seem to reproduce the original issue: gpt-4o-mini works.


shaxxx commented Nov 12, 2024

> @shaxxx In your last screenshot, it shows o1-mini, not gpt-4o-mini. Is o1-mini the one that doesn't work for you?
>
> With the current main, I don't seem to reproduce the original issue: gpt-4o-mini works.

I apologize for mixing up the screenshots.
o1-mini works; gpt-4o-mini doesn't.


shaxxx commented Nov 12, 2024

I've deleted all the containers, recreated them from main (there were some changes, since Docker needed to re-download image layers), built new containers without workspace bindings, and it's still the same.
Here's the clean log output for a "build todo app with vue" prompt, with litellm.set_verbose=True and all OpenHands debugging on.
This is as good as it gets.
https://pastebin.com/raw/LZu3k7xz


enyst commented Nov 12, 2024

@shaxxx You're right, I can reproduce it now. Can you please, for now, add -e AGENT_FUNCTION_CALLING=false to the docker command in the README?
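
For reference, here is a sketch of the README command from earlier in the thread with that flag added; only the AGENT_FUNCTION_CALLING line is new:

docker run -it --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:main-nikolaik \
    -e AGENT_FUNCTION_CALLING=false \
    -e LOG_ALL_EVENTS=true \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:main

Presumably this avoids the crash because, with native function calling disabled, the agent stops sending tool results as role 'tool' messages, which is what the provider was rejecting.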

enyst self-assigned this on Nov 13, 2024

shaxxx commented Nov 13, 2024

> @shaxxx You're right, I can reproduce it now. Can you please, for now, add -e AGENT_FUNCTION_CALLING=false to the docker command in the README?

I can confirm it doesn't crash with the additional setting.
Well, kind of: the vue todo app prompt did throw an error,

500 Server Error: Internal Server Error for url: http://host.docker.internal:31691/execute_action

but it was working normally for a while, which makes me think this is another kind of (unrelated) error with the container dependencies/setup and can be ignored.


shaxxx commented Dec 2, 2024

Still an issue with version 0.14.


shaxxx commented Dec 18, 2024

Still an issue with version 0.15.

github-actions bot commented Jan 26, 2025

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions bot added the Stale (Inactive for 30 days) label on Jan 26, 2025

github-actions bot commented Feb 2, 2025

This issue was closed because it has been stalled for over 30 days with no activity.

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Feb 2, 2025