[Bug]: Using OpenAI o1-mini - 'The model produced invalid content.' #274

Open · kevin-support-bot bot opened this issue Feb 19, 2025 · 36 comments

@kevin-support-bot

Upstream issue: All-Hands-AI#6808


@PoisonedPorkchop, would you apply this commit and retry that request?

@PoisonedPorkchop

Running the app...
20:06:33 - openhands:INFO: llm.py:162 - self.config.model='claude-3-5-sonnet-20241022'
20:06:33 - openhands:INFO: llm.py:163 - self.config.max_input_tokens=4096
20:06:33 - openhands:INFO: llm.py:164 - self.config.max_output_tokens=8192
20:06:34 - openhands:INFO: openhands_config.py:48 - Using config class None
INFO: Started server process [5743]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO: ('127.0.0.1', 58092) - "WebSocket /socket.io/?latest_event_id=0&conversation_id=32cfee6ef4794f75960ceaf6ddea3672&EIO=4&transport=websocket" [accepted]
INFO: connection open
20:06:39 - openhands:INFO: listen_socket.py:24 - sio:connect: 4LuRqCdBdxUYVSgkAAAB
20:06:39 - openhands:INFO: manager.py:225 - join_conversation:32cfee6ef4794f75960ceaf6ddea3672:4LuRqCdBdxUYVSgkAAAB
20:06:39 - openhands:INFO: manager.py:478 - _get_event_stream:32cfee6ef4794f75960ceaf6ddea3672
20:06:39 - openhands:INFO: manager.py:451 - maybe_start_agent_loop:32cfee6ef4794f75960ceaf6ddea3672
20:06:39 - openhands:INFO: manager.py:454 - start_agent_loop:32cfee6ef4794f75960ceaf6ddea3672
20:06:39 - openhands:INFO: manager.py:478 - _get_event_stream:32cfee6ef4794f75960ceaf6ddea3672
20:06:39 - openhands:INFO: manager.py:481 - found_local_agent_loop:32cfee6ef4794f75960ceaf6ddea3672
20:06:39 - openhands:INFO: llm.py:162 - self.config.model='openai/o1-mini'
20:06:39 - openhands:INFO: llm.py:163 - self.config.max_input_tokens=4096
20:06:39 - openhands:INFO: llm.py:164 - self.config.max_output_tokens=65536
20:06:39 - openhands:INFO: codeact_agent.py:103 - Function calling not enabled for model openai/o1-mini. Mocking function calling via prompting.
20:06:40 - openhands:INFO: docker_runtime.py:144 - Creating new Docker container
20:06:42 - openhands:INFO: runtime_build.py:182 - Building image: ghcr.io/all-hands-ai/runtime:oh_v0.20.0_24i4soelwunbwbu3_opq8f5zt0ctror1d
20:06:42 - openhands:INFO: docker_runtime.py:160 - Starting runtime with image: ghcr.io/all-hands-ai/runtime:oh_v0.20.0_24i4soelwunbwbu3_opq8f5zt0ctror1d
20:06:43 - openhands:INFO: docker_runtime.py:164 - Container started: kevin-runtime-persisted-oh-32cfee6ef4794f75960ceaf6ddea3672. VSCode URL: None
20:06:43 - openhands:INFO: docker_runtime.py:187 - Waiting for client to become ready at http://localhost:63725...
20:06:43 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:45 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:47 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:49 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:51 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:53 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:55 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:57 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:06:59 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:01 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:03 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:05 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:07 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:09 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:09 - openhands:INFO: docker_runtime.py:202 - Runtime is ready.
20:07:09 - openhands:INFO: docker_runtime.py:220 - Copied selenium files to runtime
20:07:09 - openhands:INFO: base.py:247 - Selected repo: None, loading microagents from /workspace/.openhands/microagents (inside runtime)
20:07:09 - openhands:INFO: agent_controller.py:486 - Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.INIT
20:07:09 - OBSERVATION
AgentStateChangedObservation(content='', agent_state=<AgentState.INIT: 'init'>, observation='agent_state_changed')
20:07:09 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: hello
20:07:09 - openhands:INFO: agent_controller.py:486 - Setting agent(CodeActAgent) state from AgentState.INIT to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 1 GLOBAL STEP 1
20:07:09 - openhands:INFO: llm.py:831 - Token count: 2640
20:07:09 - openhands:INFO: docker_runtime.py:170 - Using existing Docker container: kevin-runtime-persisted-oh-32cfee6ef4794f75960ceaf6ddea3672
20:07:09 - openhands:INFO: docker_runtime.py:412 - Container status: running
20:07:09 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:09 - openhands:INFO: docker_runtime.py:220 - Copied selenium files to runtime
INFO: 127.0.0.1:39048 - "GET /api/conversations/32cfee6ef4794f75960ceaf6ddea3672/vscode-url HTTP/1.1" 200 OK
INFO: 127.0.0.1:39042 - "GET /api/conversations/32cfee6ef4794f75960ceaf6ddea3672/list-files HTTP/1.1" 200 OK
20:07:48 - OBSERVATION
AgentStateChangedObservation(content='', agent_state=<AgentState.RUNNING: 'running'>, observation='agent_state_changed')
20:07:48 - openhands:INFO: agent_controller.py:486 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.FINISHED
20:07:48 - OBSERVATION
AgentStateChangedObservation(content='', agent_state=<AgentState.FINISHED: 'finished'>, observation='agent_state_changed')
20:07:49 - openhands:INFO: docker_runtime.py:170 - Using existing Docker container: kevin-runtime-persisted-oh-32cfee6ef4794f75960ceaf6ddea3672
20:07:49 - openhands:INFO: docker_runtime.py:412 - Container status: running
20:07:49 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:07:49 - openhands:INFO: docker_runtime.py:220 - Copied selenium files to runtime
INFO: 127.0.0.1:39052 - "GET /api/conversations/32cfee6ef4794f75960ceaf6ddea3672/list-files HTTP/1.1" 200 OK
INFO: 127.0.0.1:39052 - "GET /beep.wav HTTP/1.1" 206 Partial Content
INFO: 127.0.0.1:39052 - "GET /favicon.ico HTTP/1.1" 200 OK
20:10:37 - openhands:INFO: docker_runtime.py:170 - Using existing Docker container: kevin-runtime-persisted-oh-32cfee6ef4794f75960ceaf6ddea3672
INFO: 127.0.0.1:60466 - "GET /assets/index-BIwqRepL.js HTTP/1.1" 304 Not Modified
20:10:37 - openhands:INFO: docker_runtime.py:412 - Container status: running
20:10:37 - openhands:INFO: action_execution_client.py:103 - Checking if runtime is alive
20:10:37 - openhands:INFO: docker_runtime.py:220 - Copied selenium files to runtime
INFO: 127.0.0.1:60450 - "GET /api/conversations/32cfee6ef4794f75960ceaf6ddea3672/list-files HTTP/1.1" 200 OK
20:10:46 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: read and execute yes.txt please
20:10:46 - openhands:INFO: agent_controller.py:486 - Setting agent(CodeActAgent) state from AgentState.FINISHED to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 2 GLOBAL STEP 2
20:10:46 - openhands:INFO: llm.py:831 - Token count: 2460
INFO: 127.0.0.1:60482 - "GET /api/conversations/32cfee6ef4794f75960ceaf6ddea3672/list-files HTTP/1.1" 200 OK
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:11:15 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:11:15 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #1 | You can customize retry values in the configuration.
20:11:30 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:12:01 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:12:01 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #2 | You can customize retry values in the configuration.
20:12:16 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:12:52 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:12:52 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #3 | You can customize retry values in the configuration.
20:13:07 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:13:42 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:13:42 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #4 | You can customize retry values in the configuration.
20:13:58 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:14:25 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:14:25 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #5 | You can customize retry values in the configuration.
20:14:57 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:15:16 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:15:16 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #6 | You can customize retry values in the configuration.
20:16:20 - openhands:INFO: llm.py:831 - Token count: 2460
litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
20:16:32 - openhands:ERROR: retry_mixin.py:62 - Error: Expecting value: line 1 column 1 (char 0)
20:16:32 - openhands:ERROR: retry_mixin.py:67 - litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Attempt #7 | You can customize retry values in the configuration.
20:18:32 - openhands:INFO: llm.py:831 - Token count: 2460
20:19:03 - openhands:ERROR: agent_controller.py:233 - Error while running the agent (session ID: 32cfee6ef4794f75960ceaf6ddea3672): litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Traceback: Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 691, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 619, in completion
self.make_sync_openai_chat_completion_request(
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 439, in make_sync_openai_chat_completion_request
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 421, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_legacy_response.py", line 356, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 859, in create
return self._post(
^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1283, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 960, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1049, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1098, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1049, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1098, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1064, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1612, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1585, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 701, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 231, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 710, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 486, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 418, in exc_check
raise retry_exc.reraise()
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 185, in reraise
raise self.last_attempt.result()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 377, in wrapper
resp = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1030, in wrapper
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 906, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 2967, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2189, in exception_type
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 449, in exception_type
raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

20:19:03 - openhands:INFO: agent_controller.py:486 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
20:19:03 - OBSERVATION
AgentStateChangedObservation(content='', agent_state=<AgentState.RUNNING: 'running'>, observation='agent_state_changed')
20:19:03 - OBSERVATION
AgentStateChangedObservation(content='', agent_state=<AgentState.ERROR: 'error'>, observation='agent_state_changed')
20:19:03 - OBSERVATION
ErrorObservation
RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 32cfee6ef4794f75960ceaf6ddea3672. Error type: APIError

@SmartManoj (Owner) commented Feb 20, 2025

The request parameters are stored in logs/llm/request.json. Would you retry it with the following script?

import warnings
import json
import os

# litellm can emit warnings on import; silence them.
with warnings.catch_warnings():
    warnings.simplefilter('ignore')
    import litellm

model = 'o1-mini'
os.environ["OPENAI_API_KEY"] = "your-api-key"

# Replay the exact (args, kwargs) pair that OpenHands logged for the failing call.
args, kwargs = json.load(open("logs/llm/request.json"))

res = litellm.completion(model=model, *args, **kwargs)
print(res)
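Save it as test.py in the repo root (so the relative logs/llm/request.json path resolves) and run python3 test.py.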

@PoisonedPorkchop

I'm assuming I should run that debug function it suggests, or no?
poisonedporkchop@PORK:/mnt/f/OpenHands/Kevin$ python3 test.py

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm._turn_on_debug()'.

Traceback (most recent call last):
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/llms/openai/openai.py", line 726, in completion
raise e
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/llms/openai/openai.py", line 653, in completion
self.make_sync_openai_chat_completion_request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper
result = func(*args, **kwargs)
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/llms/openai/openai.py", line 472, in make_sync_openai_chat_completion_request
raise e
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/llms/openai/openai.py", line 454, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1290, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 967, in request
return self._request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/openai/_base_client.py", line 1071, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/main.py", line 1724, in completion
raise e
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/main.py", line 1697, in completion
response = openai_chat_completions.completion(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/llms/openai/openai.py", line 736, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/test.py", line 13, in
res = litellm.completion(model=model, *args, **kwargs)
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/utils.py", line 1190, in wrapper
raise e
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/utils.py", line 1068, in wrapper
result = original_function(*args, **kwargs)
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/main.py", line 3086, in completion
raise exception_type(
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
raise e
File "/home/poisonedporkchop/.local/lib/python3.10/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 452, in exception_type
raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
poisonedporkchop@PORK:/mnt/f/OpenHands/Kevin$

@SmartManoj (Owner)

Does adding a new assistant message with the content "DO NOT PRODUCE INVALID CONTENT" work?

https://x.com/martolini/status/1790340435799335360
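To try that against the same logged request, here is a minimal sketch (assuming the [args, kwargs] layout that the replay script above reads from logs/llm/request.json):

import json

# Load the logged request and append the extra assistant message.
args, kwargs = json.load(open("logs/llm/request.json"))
kwargs["messages"].append(
    {"role": "assistant", "content": "DO NOT PRODUCE INVALID CONTENT"}
)

# Write the modified request back so the replay script picks it up.
with open("logs/llm/request.json", "w") as f:
    json.dump([args, kwargs], f)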

@PoisonedPorkchop

python3 test.py
ModelResponse(id='chatcmpl-B2t4ZO6LY8BRJQ45wWeCLva7eda52', created=1740028055, model='o1-mini-2024-09-12', object='chat.completion', system_fingerprint='fp_fc2f147b27', choices=[Choices(finish_reason='stop', index=0, message=Message(content='Hello! How can I assist you with your coding today?', role='assistant', tool_calls=None, function_call=None, provider_specific_fields={'refusal': None}, refusal=None))], usage=Usage(completion_tokens=408, prompt_tokens=2261, total_tokens=2669, completion_tokens_details=CompletionTokensDetailsWrapper(accepted_prediction_tokens=0, audio_tokens=0, reasoning_tokens=384, rejected_prediction_tokens=0, text_tokens=None), prompt_tokens_details=PromptTokensDetailsWrapper(audio_tokens=0, cached_tokens=1408, text_tokens=None, image_tokens=None)), service_tier='default')

SmartManoj marked this as a duplicate of #239 (Feb 20, 2025)
@SmartManoj (Owner)

res = litellm.completion(model=model, *args, **kwargs)

Would you pass the seed too?

res = litellm.completion(model=model, seed=42, *args, **kwargs)
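(For context: OpenAI's seed parameter requests best-effort deterministic sampling, so a different seed reshuffles the generation that was tripping the invalid-content check.)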

@PoisonedPorkchop

Yeah I just added the seed too and it worked for me as well. However, when I boot up the app it replaces the request.json with a fresh one so all my changes are overwritten. Do you know how I could solve that?

@SmartManoj (Owner) commented Feb 20, 2025

You need to pull again. With this commit, the extra message is added automatically.

Kevin/openhands/llm/llm.py

Lines 267 to 268 in a610a2e

if self.config.model.startswith('o1-mini'):
    kwargs['messages'].append(Message(role='assistant', content=[TextContent(text='DO NOT PRODUCE INVALID CONTENT')]))

@PoisonedPorkchop

Though the code is indeed updated and I built it again, request.json seems to remain unchanged (maybe this is intentional) and I still get the error 500.
Running the app...
22:37:30 - openhands:INFO: server_config.py:39 - Using config class None
INFO: Started server process [8521]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO: 127.0.0.1:50134 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 127.0.0.1:50136 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 127.0.0.1:50144 - "GET /api/github/user HTTP/1.1" 200 OK
INFO: 127.0.0.1:50144 - "GET / HTTP/1.1" 200 OK
INFO: 127.0.0.1:50144 - "GET /favicon.ico HTTP/1.1" 200 OK
INFO: 127.0.0.1:50136 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:50134 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 127.0.0.1:50160 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 127.0.0.1:50144 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 127.0.0.1:50144 - "GET /logo.png HTTP/1.1" 200 OK
INFO: 127.0.0.1:50136 - "GET /api/github/user HTTP/1.1" 200 OK
INFO: 127.0.0.1:50160 - "GET /api/github/repositories?sort=pushed&page=1&per_page=100 HTTP/1.1" 200 OK
22:38:29 - openhands:INFO: manage_conversations.py:136 - Initializing new conversation
22:38:29 - openhands:INFO: manage_conversations.py:54 - Loading settings
22:38:29 - openhands:INFO: manage_conversations.py:57 - Settings loaded
22:38:29 - openhands:INFO: manage_conversations.py:81 - Loading conversation store
22:38:29 - openhands:INFO: manage_conversations.py:83 - Conversation store loaded
22:38:29 - openhands:INFO: manage_conversations.py:89 - New conversation ID: 6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: manage_conversations.py:96 - Saving metadata for conversation 6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: manage_conversations.py:107 - Starting agent loop for conversation 6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: standalone_conversation_manager.py:192 - maybe_start_agent_loop:6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: standalone_conversation_manager.py:195 - start_agent_loop:6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: standalone_conversation_manager.py:222 - _get_event_stream:6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: manage_conversations.py:125 - Finished initializing conversation 6e538aff1fb248408e9034bc7e100d0e
22:38:29 - openhands:INFO: llm.py:167 - self.config.model='openai/o1-mini'
22:38:29 - openhands:INFO: llm.py:168 - self.config.max_input_tokens=128000
22:38:29 - openhands:INFO: llm.py:169 - self.config.max_output_tokens=65536
22:38:29 - openhands:INFO: session.py:127 - Enabling default condenser: type='llm' llm_config=LLMConfig(use_group=None, enable_cache=False, seed=42, model='openai/o1-mini', api_key=SecretStr('**********'), base_url='', api_version=None, embedding_model='local', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id=None, aws_secret_access_key=None, aws_region_name=None, openrouter_site_url='https://docs.all-hands.dev/', openrouter_app_name='OpenHands', num_retries=4, retry_multiplier=2, retry_min_wait=5, retry_max_wait=30, timeout=None, max_message_chars=30000, temperature=1, top_p=1, custom_llm_provider=None, max_input_tokens=128000, max_output_tokens=65536, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=True, modify_params=True, disable_vision=None, caching_prompt=True, log_completions=False, log_completions_folder='/mnt/f/OpenHands/Kevin/logs/completions', custom_tokenizer=None, native_tool_calling=None, reasoning_effort='high') keep_first=3 max_size=40
22:38:29 - openhands:INFO: llm.py:167 - self.config.model='openai/o1-mini'
22:38:29 - openhands:INFO: llm.py:168 - self.config.max_input_tokens=128000
22:38:29 - openhands:INFO: llm.py:169 - self.config.max_output_tokens=65536
INFO: 127.0.0.1:38748 - "POST /api/conversations HTTP/1.1" 200 OK
INFO: 127.0.0.1:38748 - "GET /api/conversations/6e538aff1fb248408e9034bc7e100d0e HTTP/1.1" 200 OK
INFO: ('127.0.0.1', 38764) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=6e538aff1fb248408e9034bc7e100d0e&EIO=4&transport=websocket" [accepted]
INFO: connection open
22:38:30 - openhands:INFO: listen_socket.py:30 - sio:connect: DgXdZ3v06swCCycsAAAB
22:38:30 - openhands:INFO: standalone_conversation_manager.py:92 - join_conversation:6e538aff1fb248408e9034bc7e100d0e:DgXdZ3v06swCCycsAAAB
22:38:30 - openhands:INFO: standalone_conversation_manager.py:222 - _get_event_stream:6e538aff1fb248408e9034bc7e100d0e
22:38:30 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:6e538aff1fb248408e9034bc7e100d0e
22:38:30 - openhands:INFO: docker_runtime.py:177 - Using existing Docker container: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
22:38:30 - openhands:INFO: docker_runtime.py:418 - Container status: running
22:38:30 - openhands:INFO: action_execution_client.py:106 - Checking if runtime is alive
22:38:30 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
22:38:30 - openhands:INFO: base.py:309 - Selected repo: None, loading microagents from /workspace/.openhands/microagents (inside runtime)
22:38:30 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: hello
22:38:30 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 1 GLOBAL STEP 1
22:38:30 - openhands:INFO: llm.py:863 - Token count: 2640
22:38:30 - openhands:INFO: docker_runtime.py:177 - Using existing Docker container: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
22:38:30 - openhands:INFO: docker_runtime.py:418 - Container status: running
22:38:30 - openhands:INFO: action_execution_client.py:106 - Checking if runtime is alive
22:38:30 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
22:38:31 - openhands:INFO: standalone_conversation_manager.py:83 - Conversation 6e538aff1fb248408e9034bc7e100d0e connected in 0.06767416000366211 seconds
22:38:31 - openhands:INFO: standalone_conversation_manager.py:64 - Reusing active conversation 6e538aff1fb248408e9034bc7e100d0e
INFO: 127.0.0.1:38760 - "GET /api/conversations/6e538aff1fb248408e9034bc7e100d0e/vscode-url HTTP/1.1" 200 OK
INFO: 127.0.0.1:38748 - "GET /api/conversations/6e538aff1fb248408e9034bc7e100d0e/list-files HTTP/1.1" 200 OK
22:38:43 - openhands:ERROR: agent_controller.py:247 - Error while running the agent (session ID: 6e538aff1fb248408e9034bc7e100d0e): litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Traceback: Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 726, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 653, in completion
self.make_sync_openai_chat_completion_request(
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 472, in make_sync_openai_chat_completion_request
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 454, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1290, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 967, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1071, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1724, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1697, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 736, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 245, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 767, in _step
raise e
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 723, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 485, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 403, in wrapper
resp = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1190, in wrapper
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1068, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 3085, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 452, in exception_type
raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

22:38:43 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
22:38:43 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.ERROR to running
CodeActAgent LEVEL 0 LOCAL STEP 2 GLOBAL STEP 2
22:38:43 - openhands:INFO: llm.py:863 - Token count: 2640
22:38:43 - openhands:INFO: standalone_conversation_manager.py:71 - Reusing detached conversation 6e538aff1fb248408e9034bc7e100d0e
INFO: 127.0.0.1:44166 - "GET /api/conversations/6e538aff1fb248408e9034bc7e100d0e/list-files HTTP/1.1" 200 OK
INFO: 127.0.0.1:44166 - "GET /beep.wav HTTP/1.1" 206 Partial Content
INFO: 127.0.0.1:44166 - "GET /favicon.ico HTTP/1.1" 200 OK
22:38:51 - openhands:ERROR: agent_controller.py:247 - Error while running the agent (session ID: 6e538aff1fb248408e9034bc7e100d0e): litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Traceback: Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 726, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 653, in completion
self.make_sync_openai_chat_completion_request(
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 472, in make_sync_openai_chat_completion_request
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 454, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1290, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 967, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1071, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1724, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1697, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 736, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 245, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 767, in _step
raise e
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 723, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 485, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 403, in wrapper
resp = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1190, in wrapper
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1068, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 3085, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 452, in exception_type
raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

22:38:51 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from running to AgentState.ERROR
22:38:51 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.LOADING', agent_state='running', observation='agent_state_changed')
22:38:52 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.RUNNING', agent_state='error', observation='agent_state_changed')
22:38:52 - OBSERVATION
ErrorObservation
RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 6e538aff1fb248408e9034bc7e100d0e. Error: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
22:38:52 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.ERROR', agent_state='running', observation='agent_state_changed')
22:38:52 - OBSERVATION
AgentStateChangedObservation(content='From running', agent_state='error', observation='agent_state_changed')
22:38:52 - OBSERVATION
ErrorObservation
RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 6e538aff1fb248408e9034bc7e100d0e. Error: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

@SmartManoj (Owner) commented Feb 20, 2025

Oops, updated.

Kevin/openhands/llm/llm.py

Lines 267 to 268 in 67a6949

if self.config.model.split('/')[-1].startswith('o1-mini'):
    kwargs['messages'].append(Message(role='assistant', content=[TextContent(text='DO NOT PRODUCE INVALID CONTENT')]))
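For reference, the earlier check never matched because the configured name carries the provider prefix (the logs above show self.config.model='openai/o1-mini'); a quick illustration of the string handling:

model = 'openai/o1-mini'
model.startswith('o1-mini')                  # False: blocked by the 'openai/' prefix
model.split('/')[-1].startswith('o1-mini')   # True: checks the bare model name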

No need to build again for this small change.

@PoisonedPorkchop

I'm assuming there's an import missing or something?
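Presumably something like this near the top of openhands/llm/llm.py would fix it (assuming Message and TextContent live in openhands.core.message, as in upstream OpenHands):

from openhands.core.message import Message, TextContent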

23:10:10 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: hello
23:10:10 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 1 GLOBAL STEP 1
23:10:10 - openhands:ERROR: agent_controller.py:247 - Error while running the agent (session ID: 4e1974ce318a4f73bbce71f0392ecb80): name 'TextContent' is not defined. Traceback: Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 245, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 767, in _step
raise e
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 723, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 485, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 268, in wrapper
kwargs['messages'].append(Message(role='assistant', content=[TextContent(text='DO NOT PRODUCE INVALID CONTENT')]))
^^^^^^^^^^^
NameError: name 'TextContent' is not defined

23:10:10 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
23:10:10 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.ERROR to running
23:10:10 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.LOADING', agent_state='running', observation='agent_state_changed')
23:10:10 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.RUNNING', agent_state='error', observation='agent_state_changed')
23:10:10 - OBSERVATION
ErrorObservation
RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 4e1974ce318a4f73bbce71f0392ecb80. Error: name 'TextContent' is not defined
CodeActAgent LEVEL 0 LOCAL STEP 2 GLOBAL STEP 2
23:10:10 - openhands:ERROR: agent_controller.py:247 - Error while running the agent (session ID: 4e1974ce318a4f73bbce71f0392ecb80): name 'TextContent' is not defined. Traceback: Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 245, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 767, in _step
raise e
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 723, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 485, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 268, in wrapper
kwargs['messages'].append(Message(role='assistant', content=[TextContent(text='DO NOT PRODUCE INVALID CONTENT')]))
^^^^^^^^^^^
NameError: name 'TextContent' is not defined

@SmartManoj
Copy link
Owner

Thanks, fixed.

@PoisonedPorkchop
Copy link

Thanks. I also have to keep changing this snippet from the first form to the second so it doesn't error:

if self.config.model.split('/')[-1].startswith('o1-'):
    # Message types: user and assistant messages only; system messages are not supported.
    messages[0].role = 'user'

if self.config.model.split('/')[-1].startswith('o1-'):
    # Message types: user and assistant messages only; system messages are not supported.
    messages[0]['role'] = 'user'
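The difference is attribute vs. key access: by the time this code runs, the messages have apparently already been serialized to plain dicts, so the object-style messages[0].role raises. A type-agnostic sketch (hypothetical, not the committed fix) that tolerates both shapes:

    first = messages[0]
    if isinstance(first, dict):
        first['role'] = 'user'  # already-serialized dict message
    else:
        first.role = 'user'     # Message object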

I tried it out, though. It lets me get my foot in the door, but no commands can be asked for, or it causes this again:
Running the app...
23:34:37 - openhands:INFO: server_config.py:39 - Using config class None
INFO: Started server process [9329]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)
INFO: 127.0.0.1:43020 - "GET / HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /logo.png HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /favicon.ico HTTP/1.1" 200 OK
INFO: 127.0.0.1:43034 - "GET /logo.png HTTP/1.1" 200 OK
INFO: 127.0.0.1:43020 - "GET /api/github/user HTTP/1.1" 200 OK
INFO: 127.0.0.1:43028 - "GET /api/github/repositories?sort=pushed&page=1&per_page=100 HTTP/1.1" 200 OK
23:34:49 - openhands:INFO: manage_conversations.py:136 - Initializing new conversation
23:34:49 - openhands:INFO: manage_conversations.py:54 - Loading settings
23:34:49 - openhands:INFO: manage_conversations.py:57 - Settings loaded
23:34:49 - openhands:INFO: manage_conversations.py:81 - Loading conversation store
23:34:49 - openhands:INFO: manage_conversations.py:83 - Conversation store loaded
23:34:50 - openhands:INFO: manage_conversations.py:89 - New conversation ID: 15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: manage_conversations.py:96 - Saving metadata for conversation 15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: manage_conversations.py:107 - Starting agent loop for conversation 15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: standalone_conversation_manager.py:192 - maybe_start_agent_loop:15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: standalone_conversation_manager.py:195 - start_agent_loop:15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: standalone_conversation_manager.py:222 - _get_event_stream:15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: manage_conversations.py:125 - Finished initializing conversation 15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: llm.py:168 - self.config.model='openai/o1-mini'
23:34:50 - openhands:INFO: llm.py:169 - self.config.max_input_tokens=128000
23:34:50 - openhands:INFO: llm.py:170 - self.config.max_output_tokens=65536
23:34:50 - openhands:INFO: session.py:127 - Enabling default condenser: type='llm' llm_config=LLMConfig(use_group=None, enable_cache=False, seed=42, model='openai/o1-mini', api_key=SecretStr('**********'), base_url='', api_version=None, embedding_model='local', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id=None, aws_secret_access_key=None, aws_region_name=None, openrouter_site_url='https://docs.all-hands.dev/', openrouter_app_name='OpenHands', num_retries=4, retry_multiplier=2, retry_min_wait=5, retry_max_wait=30, timeout=None, max_message_chars=30000, temperature=1, top_p=1, custom_llm_provider=None, max_input_tokens=128000, max_output_tokens=65536, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=True, modify_params=True, disable_vision=None, caching_prompt=True, log_completions=False, log_completions_folder='/mnt/f/OpenHands/Kevin/logs/completions', custom_tokenizer=None, native_tool_calling=None, reasoning_effort='high') keep_first=3 max_size=40
23:34:50 - openhands:INFO: llm.py:168 - self.config.model='openai/o1-mini'
23:34:50 - openhands:INFO: llm.py:169 - self.config.max_input_tokens=128000
23:34:50 - openhands:INFO: llm.py:170 - self.config.max_output_tokens=65536
INFO: 127.0.0.1:43044 - "POST /api/conversations HTTP/1.1" 200 OK
INFO: 127.0.0.1:43044 - "GET /api/conversations/15e4dc8a28654104b1ec76bebe5a64ec HTTP/1.1" 200 OK
INFO: ('127.0.0.1', 34362) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=15e4dc8a28654104b1ec76bebe5a64ec&EIO=4&transport=websocket" [accepted]
INFO: connection open
23:34:50 - openhands:INFO: listen_socket.py:30 - sio:connect: Zft6JZWJWD8j232uAAAB
23:34:50 - openhands:INFO: standalone_conversation_manager.py:92 - join_conversation:15e4dc8a28654104b1ec76bebe5a64ec:Zft6JZWJWD8j232uAAAB
23:34:50 - openhands:INFO: standalone_conversation_manager.py:222 - _get_event_stream:15e4dc8a28654104b1ec76bebe5a64ec
23:34:50 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:15e4dc8a28654104b1ec76bebe5a64ec
23:34:51 - openhands:INFO: docker_runtime.py:151 - Creating new Docker container for kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
23:34:52 - openhands:INFO: runtime_build.py:182 - Building image: ghcr.io/all-hands-ai/runtime:oh_v0.25.0_we8ywhfe2dm8vpfn_fg7w005z0iuc76go
23:34:52 - openhands:INFO: docker_runtime.py:167 - Starting runtime with image: ghcr.io/all-hands-ai/runtime:oh_v0.25.0_we8ywhfe2dm8vpfn_fg7w005z0iuc76go
23:34:53 - openhands:INFO: docker_runtime.py:171 - Container started: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace. VSCode URL: None
23:34:53 - openhands:INFO: docker_runtime.py:194 - Waiting for client to become ready at http://localhost:63710...
23:34:53 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:34:55 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:34:57 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:34:59 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:01 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:03 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:05 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:07 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:09 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:11 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:13 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:15 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:17 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:17 - openhands:INFO: docker_runtime.py:209 - Runtime is ready.
23:35:17 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
23:35:17 - openhands:INFO: base.py:312 - Selected repo: None, loading microagents from /workspace/.openhands/microagents (inside runtime)
23:35:18 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: hello there
23:35:18 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 1 GLOBAL STEP 1
23:35:18 - openhands:INFO: llm.py:866 - Token count: 2652
23:35:18 - openhands:INFO: docker_runtime.py:177 - Using existing Docker container: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
23:35:18 - openhands:INFO: docker_runtime.py:418 - Container status: running
23:35:18 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:18 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
23:35:18 - openhands:INFO: standalone_conversation_manager.py:83 - Conversation 15e4dc8a28654104b1ec76bebe5a64ec connected in 0.05824923515319824 seconds
23:35:18 - openhands:INFO: standalone_conversation_manager.py:64 - Reusing active conversation 15e4dc8a28654104b1ec76bebe5a64ec
INFO: 127.0.0.1:43068 - "GET /api/conversations/15e4dc8a28654104b1ec76bebe5a64ec/vscode-url HTTP/1.1" 200 OK
INFO: 127.0.0.1:43056 - "GET /api/conversations/15e4dc8a28654104b1ec76bebe5a64ec/list-files HTTP/1.1" 200 OK
23:35:26 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.LOADING', agent_state='running', observation='agent_state_changed')
23:35:26 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.FINISHED
23:35:26 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.RUNNING', agent_state='finished', observation='agent_state_changed')
23:35:27 - openhands:INFO: docker_runtime.py:177 - Using existing Docker container: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
INFO: 127.0.0.1:43684 - "GET /beep.wav HTTP/1.1" 206 Partial Content
23:35:27 - openhands:INFO: docker_runtime.py:418 - Container status: running
23:35:27 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:27 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
23:35:27 - openhands:INFO: standalone_conversation_manager.py:83 - Conversation 15e4dc8a28654104b1ec76bebe5a64ec connected in 0.06962704658508301 seconds
INFO: 127.0.0.1:43684 - "GET /favicon.ico HTTP/1.1" 200 OK
INFO: 127.0.0.1:43686 - "GET /api/conversations/15e4dc8a28654104b1ec76bebe5a64ec/list-files HTTP/1.1" 200 OK
23:35:53 - USER_ACTION
MessageAction (source=EventSource.USER)
CONTENT: Create a file for me. Name it kitty.txt
23:35:53 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.FINISHED to AgentState.RUNNING
CodeActAgent LEVEL 0 LOCAL STEP 2 GLOBAL STEP 2
23:35:53 - openhands:INFO: llm.py:866 - Token count: 2679
23:35:53 - openhands:INFO: docker_runtime.py:177 - Using existing Docker container: kevin-runtime-persisted-oh-_mnt_f_OpenHands_Kevin_workspace
23:35:53 - openhands:INFO: docker_runtime.py:418 - Container status: running
23:35:53 - openhands:INFO: action_execution_client.py:107 - Checking if runtime is alive
23:35:53 - openhands:INFO: docker_runtime.py:227 - Copied selenium files to runtime
23:35:53 - openhands:INFO: standalone_conversation_manager.py:83 - Conversation 15e4dc8a28654104b1ec76bebe5a64ec connected in 0.06095576286315918 seconds
INFO: 127.0.0.1:43700 - "GET /api/conversations/15e4dc8a28654104b1ec76bebe5a64ec/list-files HTTP/1.1" 200 OK
23:36:09 - openhands:ERROR: agent_controller.py:247 - Error while running the agent (session ID: 15e4dc8a28654104b1ec76bebe5a64ec): litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}. Traceback: Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 726, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 653, in completion
self.make_sync_openai_chat_completion_request(
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper
result = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 472, in make_sync_openai_chat_completion_request
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 454, in make_sync_openai_chat_completion_request
raw_response = openai_client.chat.completions.with_raw_response.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped
return cast(LegacyAPIResponse[R], func(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1290, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 967, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1056, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/openai/_base_client.py", line 1071, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1724, in completion
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 1697, in completion
response = openai_chat_completions.completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 736, in completion
raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 245, in _step_with_exception_handling
await self._step()
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 767, in _step
raise e
File "/mnt/f/OpenHands/Kevin/openhands/controller/agent_controller.py", line 723, in _step
action = self.agent.step(self.state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/agenthub/codeact_agent/codeact_agent.py", line 485, in step
response = self.llm.completion(**params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/mnt/f/OpenHands/Kevin/openhands/llm/llm.py", line 405, in wrapper
resp = self._completion_unwrapped(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1190, in wrapper
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/utils.py", line 1068, in wrapper
result = original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/main.py", line 3085, in completion
raise exception_type(
^^^^^^^^^^^^^^^
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2202, in exception_type
raise e
File "/home/poisonedporkchop/.cache/pypoetry/virtualenvs/openhands-ai-U4dmJb3Z-py3.12/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 452, in exception_type
raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

23:36:09 - openhands:INFO: agent_controller.py:502 - Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR
23:36:09 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.FINISHED', agent_state='running', observation='agent_state_changed')
23:36:09 - OBSERVATION
AgentStateChangedObservation(content='From AgentState.RUNNING', agent_state='error', observation='agent_state_changed')
23:36:09 - OBSERVATION
ErrorObservation
RuntimeError: There was an unexpected error while running the agent. Please report this error to the developers. Your session ID is 15e4dc8a28654104b1ec76bebe5a64ec. Error: litellm.APIError: APIError: OpenAIException - Error code: 500 - {'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}

@SmartManoj
Copy link
Owner

but the test file works?

@PoisonedPorkchop
Copy link

Well, if I leave the request.json as it was generated when it errored above and run that Python test again, no, it does not work; it returns the same error 500.
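For reference, this is the kind of replay test being described; a minimal sketch, assuming request.json stores the logged [args, kwargs] pair in the format shown later in this thread:

    import json
    import litellm

    with open('request.json') as f:
        args, kwargs = json.load(f)  # logged format: [[], {"messages": [...], "stop": [...]}]

    # Replay the exact request the agent sent.
    resp = litellm.completion(*args, model='openai/o1-mini', **kwargs)
    print(resp.choices[0].message.content)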

@SmartManoj
Copy link
Owner

Does the generated request.json contain the new assistant message?

@PoisonedPorkchop
Copy link

Yes, I confirmed the new message is in there. I also tried moving it around to see if that would help; it doesn't seem to work. By the way, I tried the same instructions with 4o a few minutes ago and it works.

@SmartManoj
Copy link
Owner

SmartManoj commented Feb 20, 2025

Would you compare the new request.json with the one that worked? https://www.diffchecker.com/

@PoisonedPorkchop
Copy link

Here is my original one that I got working (but on the right it breaks):
https://www.diffchecker.com/7P0Vh7yf/
My original working one again vs. the newly generated one:
https://www.diffchecker.com/bRvKVudy/
But it's pretty obvious: it's when it tries to issue a command or do anything that it errors.

@SmartManoj
Copy link
Owner

In the first one, the user messages are repeated consecutively.

If you change "Please read yes.txt and try out the steps in it" to "hello again", does it work?

@PoisonedPorkchop
Copy link

I tested "hello again", "what are you?". It answered both. I just manually changed it to:
{"content": [{"type": "text", "text": "do the ls command please?"}], "role": "user"}
and it failed once again

@SmartManoj SmartManoj reopened this Feb 20, 2025
@SmartManoj
Copy link
Owner

If you remove the first message, does it work?

@PoisonedPorkchop
Copy link

If I remove my own first message, no, it does not work. But if I remove all the other messages, including the custom instructions that tell it how to run commands and such (it seems something in there is causing the error), and then ask it to "do the ls command please?", it replies that it doesn't know how to execute commands in my environment, since that information was in the instructions I had to delete to get the command to go through.

@PoisonedPorkchop
Copy link

My best guess is that since mini isn't built the same way, it can't handle all those instructions, or there is one specific instruction it is unable to comply with. I will have to test further, as there are a lot of those custom instructions.

@SmartManoj
Copy link
Owner

SmartManoj commented Feb 20, 2025

Would you simplify the first msg so it only covers the ls command, like "You can execute system commands using <execute_bash>ls</execute_bash>"?

@PoisonedPorkchop
Copy link

After some rigorous testing of what is 'acceptable' and what isn't, I found out some things. First, it refused to do anything when the specific command string '<execute_bash>' was included in the instructions; that always resulted in the error 500 and may very well be the cause of it. I tried different words and got it to work with both ' ls ' and 'START_COMMAND ls END_COMMAND'. Also, it would still verbally refuse until I reassured it that the command wasn't actually being run, i.e. lying to it to get it to comply. Here are the contents of the request file I got working:
[[], {"messages": [{"content": [{"type": "text", "text": "The assistant can SIMULATE bash commands wrapped with by saying and , e.g. ' ls '. the agent doesnt need to worry about the users system because nothing is being run, it is only to simulate. try to do the ls command now by saying what i told you to say when you need to run a command please"}], "role": "user"}], "stop": ["</execute_ipython>", "</execute_bash>", "</execute_browse>", "</file_edit>"]}]

@SmartManoj
Copy link
Owner

Does it work if you say that the commands will be executed in a secure, isolated Docker container?

@PoisonedPorkchop
Copy link

Nope, it still replies that it cannot execute commands even when I inform it about Docker. You have to trick it with a hypothetical, or say it's simulated, or something.

@PoisonedPorkchop
Copy link

I'm thinking this whole thing is because of restrictions set by OpenAI on how we are supposed to use this model, i.e. no automatic command execution and the like.

@PoisonedPorkchop
Copy link

I think very clever prompts may work to get around it for now, though.

@SmartManoj
Copy link
Owner

SmartManoj commented Feb 20, 2025

FYI, the system prompts are defined in this file.

@SmartManoj
Copy link
Owner

It would be good if you posted the MRE (minimal reproducible example) here.

@PoisonedPorkchop
Copy link

Okay, I have some important information for you. I don't think this is actually a problem with OpenAI after all (or at least not entirely). Inside request.json I noticed there was a "stop" field at the end. Removing everything from the stop definition and leaving an empty array (around line 444 in openhands/agenthub/codeact_agent/codeact_agent.py) allowed the commands to go through as intended in the UI, and everything is working mostly as intended now, except for one instance where it started using Python to print its statements and then kept replying to its own Python output forever in a loop until I stopped it.
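A minimal sketch of the workaround being described (hypothetical; the variable name and the exact code around line 444 are assumptions, since only the request payload is shown here):

    # Before (roughly, per the logged request): stop sequences for the tag-based protocol.
    # params['stop'] = ['</execute_ipython>', '</execute_bash>', '</execute_browse>', '</file_edit>']

    # Workaround: send no stop sequences at all.
    params['stop'] = []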

@SmartManoj
Copy link
Owner

SmartManoj commented Feb 20, 2025

It seems they simply check whether the tags are properly closed?

Would you share the chat history link using the thumbs-down (👎) button above the message input field?

In upstream, the stop parameter is not used.
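One way to reconcile the two behaviors without dropping the stop sequences for every model (a hedged sketch; params and the model check mirror the snippets above rather than actual committed code):

    if self.config.model.split('/')[-1].startswith('o1-'):
        # o1-mini appears to 500 with "The model produced invalid content"
        # when stop sequences are supplied, so omit them for these models.
        params.pop('stop', None)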

@SmartManoj
Copy link
Owner

SmartManoj commented Feb 20, 2025

[screenshot]
Some steps are missing. Would you create a new chat?

You need to run the following in the Notebook to fix the NameError: name 'open_file' is not defined:
from openhands.runtime.plugins.agent_skills.agentskills import *
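After that import, the agent-skills helpers are available directly in the notebook; for example (assuming a kitty.txt created earlier in the session):

    open_file('kitty.txt')  # prints the file contents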
