
[Bug]: Working with Ollama couldn't be automatic #5928

Closed · 1 task done
mocheng opened this issue Dec 31, 2024 · 1 comment
Labels
bug Something isn't working

Comments


mocheng commented Dec 31, 2024

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Describe the bug and reproduction steps

Repro Steps

The OpenHands Docker container is started with the command below:

docker run -it --rm --pull=always \
    -e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.17-nikolaik \
    -e LOG_ALL_EVENTS=true \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v ~/.openhands-state:/.openhands-state \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    --name openhands-app \
    docker.all-hands.dev/all-hands-ai/openhands:0.17
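
(For anyone reproducing this: a quick sanity check, not part of the original report, to confirm Ollama is reachable from inside the container. It assumes curl is available in the OpenHands image and that Ollama listens on its default port, 11434.)

    # List the models Ollama serves, as seen from inside the container.
    # An error or empty response here means the host-gateway mapping
    # or the base URL is wrong.
    docker exec openhands-app curl -s http://host.docker.internal:11434/api/tags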

In the web UI at http://0.0.0.0:3000/, I configure the settings to use the local Ollama server as shown below.
(screenshot: LLM settings pointing at the local Ollama server)
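
(The exact values aren't recoverable from the screenshot, but the settings would look roughly like this; OpenHands routes requests through LiteLLM, so Ollama models are addressed with an "ollama/" prefix. The field names and the placeholder API key are assumptions.)

    Custom Model: ollama/qwen2.5-coder:7b
    Base URL:     http://host.docker.internal:11434
    API Key:      ollama   (any non-empty placeholder value)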

After entering the prompt below, the agent runs.

I want to create a React app that allows me to:

See all the items on my todo list
add a new item to the list
mark an item as done
totally remove an item from the list
change the text of an item
set a due date on the item
This should be a client-only app with no backend. The list should persist in localStorage.

Please add tests for all of the above and make sure they pass

Expected Result

The Agent should at least build a scaffold of the application.

Actual Result

The agent prompts the user to "Download files" and keeps waiting for user input. The workspace is not automatically updated from the LLM response.
(screenshot: the agent asking the user to download files)

I've tried qwen2.5-coder:7b, llama3.1:8b, and starcoder2:3b from Ollama. All produce the same result, so I don't think it is an issue with any single LLM. It looks like the OpenHands agent can't handle the Ollama response: if the LLM response has to be downloaded to files manually, there is no way for the workflow to be automatic.
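
(A debugging step, not from the original report: since the container was started with LOG_ALL_EVENTS=true, the raw agent events, including what the model actually returned, can be inspected in the container logs.)

    # Stream the OpenHands app logs to see the per-event traces
    # enabled by LOG_ALL_EVENTS=true, including raw LLM responses.
    docker logs -f openhands-app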

OpenHands Installation

app.all-hands.dev

OpenHands Version

0.17-nikolaik

Operating System

macOS

Logs, Errors, Screenshots, and Additional Context

No response

mocheng added the bug (Something isn't working) label on Dec 31, 2024

neubig (Contributor) commented Dec 31, 2024

Hi @mocheng, unfortunately the models you tried don't have strong enough instruction-following ability to work well with OpenHands. We recommend Claude 3.5 Sonnet, but if you want to use an open model, we'd recommend llama-3.3-70B or deepseek-v3.
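
(If you want to try one of those locally: a sketch, assuming the Ollama library tag "llama3.3" for the 70B instruct model; note it needs on the order of 40+ GB of memory to run.)

    # Pull and serve the recommended open model through Ollama.
    ollama pull llama3.3
    ollama run llama3.3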

We'll also try to create a more extensive leaderboard to give recommendations for which models to use: #5744

Closing this for now, but you can monitor the leaderboard issue and take a look at the models on the leaderboard that fit your requirements.

neubig closed this as completed on Dec 31, 2024