[Errno 111] Connection refused #146
Comments
I have the same issue, but on Windows, after only following the Docker installation. |
Same issue using Docker on Windows. |
Same issue on a MacBook M1 with 16 GB RAM. |
I got the exact same issue, M1 with 16 GB RAM. |
Same on M3. |
Same in WSL on Windows 11. |
Same here on a MacBook Pro M4. |
Same here on Linux (Fedora). |
What fixed it for me is using host network mode in docker-compose.yml:

services:
  browser-use-webui:
    # ...
    network_mode: host |
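Before changing compose settings, it can help to confirm whether the error really is a reachability problem with a plain TCP probe run from the same environment as the webui. A minimal sketch (the `can_connect` helper is hypothetical, not part of the project; port 11434 is Ollama's default):

```python
import socket

def can_connect(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # includes [Errno 111] Connection refused
        return False

# With network_mode: host, the container shares the host's network stack,
# so localhost:11434 inside the container is the host's Ollama port.
print(can_connect("localhost", 11434))
```

If this prints False inside the container but True on the host, the problem is container networking, not Ollama itself.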
I tried this and it didn't fix it for me - in fact I can't even get it to properly boot up, it seems to get stuck in some sort of error loop. |
For the Docker install, set the base URL to http://host.docker.internal:11434. Make sure Ollama is refreshed after each server restart. |
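To verify that a candidate base URL actually works from wherever the webui runs, you can query Ollama's /api/tags endpoint directly. A minimal sketch, assuming the standard Ollama HTTP API; the OLLAMA_ENDPOINT environment override and the helper name are assumptions, not project code:

```python
import json
import os
import urllib.error
import urllib.request

# From inside a Docker container, host.docker.internal points back at the
# host machine; localhost would point at the container itself.
base_url = os.environ.get("OLLAMA_ENDPOINT", "http://host.docker.internal:11434")

def list_ollama_models(base_url: str) -> list[str]:
    """Return the model names Ollama reports at <base_url>/api/tags."""
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

try:
    print(list_ollama_models(base_url))
except (urllib.error.URLError, OSError) as exc:
    print(f"Cannot reach Ollama at {base_url}: {exc}")
```

If this lists your models, the same base URL should work in the webui settings.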
Having the same issue; tried this but no luck. |
Also make sure to bump the browser-use version in the requirements file from 1.29.0 to 1.30.0. It's not perfect: sometimes the agent runs and sometimes it doesn't, lol. |
Here's what I tried, and still no luck.
Since this looks like a network issue, maybe modifying something in the firewall might work? |
I'm running on a 2018 Intel Mac on macOS Sonoma. When I run the agent, I receive [Errno 61] Connection refused. How do I resolve this? |
Same thing here ... |
Please ensure that the Ollama URL can be accessed. If you run the webui in Docker, I don't think http://localhost:11434/ is the correct base URL, since localhost resolves to the container itself. |
Ollama can be accessed from other Docker AI apps like Open WebUI, but not from browser web-ui using the same URL, so there is some issue. |
@vvincent1234 I'm running directly from the terminal without Docker; I just cloned the source code and ran it. I've tried many times, but I get "Connection refused" every time. |
I had the same Errno 111 while running browser-use/webui via Docker. Changing OLLAMA_ENDPOINT in the .env file to the value below worked for me. //.env
One additional point: I also changed the docker-compose.yml file, because the build somehow didn't detect Apple silicon, so I forced the arm64 platform configuration. //docker-compose.yml |
As I understood it, the problem is that a localhost URL won't work because it resolves to the container itself. The solution for me was to run Ollama as a Docker container:

docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

then run the model:

docker exec -it ollama ollama run qwen2.5:7b

and then use the base URL http://yourhostname:11434. I no longer have the issue, but it is very slow for local models: the task (go to google.com, type 'OpenAI', click search) took around 15 minutes to complete with the local deepseek and qwen models. Better to use Google Gemini's gemini-2.0-flash-exp model with the free API, which can complete it in only about 1 minute. |
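The common thread in these fixes is that the base URL must name a host that is reachable from wherever the webui process actually runs. A heuristic sketch of choosing a default (a hypothetical helper, not the project's logic; the /.dockerenv check is a common but not guaranteed Docker marker):

```python
import os
import socket

def default_ollama_url() -> str:
    """Pick a sensible Ollama base URL.

    Inside a Docker container, localhost refers to the container itself,
    so prefer host.docker.internal when it resolves; otherwise fall back
    to localhost (the right choice for a native, non-Docker install).
    """
    if os.path.exists("/.dockerenv"):  # crude "am I inside Docker?" check
        try:
            socket.gethostbyname("host.docker.internal")
            return "http://host.docker.internal:11434"
        except socket.gaierror:
            pass
    return "http://localhost:11434"

print(default_ollama_url())
```

Note that host.docker.internal resolves out of the box on Docker Desktop (macOS/Windows); on plain Linux Docker it needs an extra_hosts entry or host network mode.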
On MacBook M1, running via Docker.
I made the adjustment to the Dockerfile as described here: #100
But when I run the agent, I receive
[Errno 111] Connection refused
How do I debug/resolve this issue?
Thanks!