Repro Steps
The OpenHands Docker container is run with the below command. In the http://0.0.0.0:3000/ web UI, I configured the settings to use local Ollama as shown below.
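(The original command and settings screenshots didn't carry over. For reference, the documented docker run invocation for this release looks roughly like the following; the flags, mounts, and tags follow the OpenHands docs and are assumptions, not necessarily the reporter's exact command:)

```bash
docker run -it --rm --pull=always \
  -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.17-nikolaik \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p 3000:3000 \
  --add-host host.docker.internal:host-gateway \
  --name openhands-app \
  docker.all-hands.dev/all-hands-ai/openhands:0.17
```

And the local-Ollama settings in the UI would look roughly like this; the model name, base URL, and key are assumptions based on the OpenHands local-LLM docs:

```
Custom Model: ollama/qwen2.5-coder:7b
Base URL:     http://host.docker.internal:11434
API Key:      ollama   (any non-empty value)
```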
After entering the prompt below, the Agent runs:
I want to create a React app that allows me to:
See all the items on my todo list
add a new item to the list
mark an item as done
totally remove an item from the list
change the text of an item
set a due date on the item
This should be a client-only app with no backend. The list should persist in localStorage.
Please add tests for all of the above and make sure they pass
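(As an aside, the localStorage persistence the prompt asks for is small enough that even a scaffold should handle it. A minimal sketch of the storage hook, assuming React with TypeScript; every name here is illustrative, not from the reporter's app:)

```tsx
import { useEffect, useState } from "react";

// Shape of a single todo item (dueDate kept as an ISO date string).
type Todo = {
  id: string;
  text: string;
  done: boolean;
  dueDate?: string;
};

const STORAGE_KEY = "todos"; // illustrative key name

function useTodos() {
  // Lazy initializer: read localStorage once on mount.
  const [todos, setTodos] = useState<Todo[]>(() => {
    const raw = localStorage.getItem(STORAGE_KEY);
    return raw ? (JSON.parse(raw) as Todo[]) : [];
  });

  // Persist on every change.
  useEffect(() => {
    localStorage.setItem(STORAGE_KEY, JSON.stringify(todos));
  }, [todos]);

  const addTodo = (text: string) =>
    setTodos((t) => [...t, { id: crypto.randomUUID(), text, done: false }]);
  const toggleDone = (id: string) =>
    setTodos((t) => t.map((x) => (x.id === id ? { ...x, done: !x.done } : x)));
  const removeTodo = (id: string) =>
    setTodos((t) => t.filter((x) => x.id !== id));

  return { todos, addTodo, toggleDone, removeTodo };
}
```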
Expected Result
The Agent should at least build a scaffold of the application.
Actual Result
The Agent prompts the user to "Download files" and keeps waiting for user input. The workspace is not automatically updated from the LLM response.
I've tried qwen2.5-coder:7b, llama3.1:8b, and starcoder2:3b from Ollama. All produce the same result, so I don't think this is an issue with any single LLM. It looks like the OpenHands Agent can't handle the Ollama responses. If the LLM response has to be manually downloaded to files, there is no way for the workflow to be automatic.
OpenHands Installation
app.all-hands.dev
OpenHands Version
0.17-nikolaik
Operating System
macOS
Logs, Errors, Screenshots, and Additional Context
No response
Hi @mocheng, unfortunately the models you tried don't have strong enough instruction-following ability to work well with OpenHands. We recommend Claude-3.5-Sonnet, but if you want to use an open model, we'd recommend llama-3.3-70B or deepseek-v3.
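(With the LiteLLM-style model strings OpenHands accepts, those would be specified roughly as below; the exact provider prefixes and tags are assumptions:)

```
ollama/llama3.3:70b      # Llama-3.3-70B served locally via Ollama
deepseek/deepseek-chat   # DeepSeek-V3 via the DeepSeek API
```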
We'll also try to create a more extensive leaderboard to give recommendations for which models to use: #5744
Closing this for now, but you can monitor the leaderboard issue and take a look at the models on the leaderboard that fit your requirements.