
[Errno 111] Connection refused #146

Open
dpaluy opened this issue Jan 24, 2025 · 22 comments

Comments

@dpaluy

dpaluy commented Jan 24, 2025

On a MacBook M1, running via Docker.

I made the adjustment to the Dockerfile as described here: #100

But when I run the agent, I receive [Errno 111] Connection refused.

How do I debug/resolve this issue?

Thanks!
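(Context, not from the thread: Errno 111 is ECONNREFUSED on Linux, i.e. nothing is listening at the address the agent dials.) A minimal, illustrative probe you could run from inside the container to see which endpoints it can actually reach; the host names and port 11434 (Ollama's default) are assumptions about this setup:

```python
import errno
import socket

def probe(host: str, port: int, timeout: float = 3.0):
    """Try a TCP connect; return None on success, else the errno."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return None
    except OSError as e:
        return e.errno

# Inside a Docker container, "localhost" is the container itself, so a
# service running on the host is usually reached via host.docker.internal.
for host in ("localhost", "host.docker.internal"):
    rc = probe(host, 11434)  # 11434 is Ollama's default port
    print(host, "->", "ok" if rc is None else f"errno {rc}")
```

If the probe prints errno 111 for every candidate host, the problem is the address/port, not the agent itself.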

@chris-amaya

chris-amaya commented Jan 26, 2025

I have the same issue, but on Windows, following only the Docker installation.

@mayphilc

Same issue using Docker on Windows.

@barrettluke

Same issue on a MacBook M1 with 16 GB of RAM.

@forero94

Same issue here on Windows:

ERROR [agent] ❌ Result failed 1/5 times:
browser-use-webui-1 | [Errno 111] Connection refused
browser-use-webui-1 | 2025-01-27 14:14:35,560 DEBG 'webui' stdout output.

@jasenwar

I got the exact same on an M1 with 16 GB of RAM.

@TDMarko

TDMarko commented Jan 28, 2025

Same on M3

@meteoro

meteoro commented Jan 28, 2025

Same in WSL on Windows 11.

@telnemri

Same here on a MacBook Pro M4.

@SussyD3V

Same here on Linux (Fedora).

@eEQK

eEQK commented Jan 29, 2025

What fixed it for me is using host network mode:

services:
  browser-use-webui:
    # ...
    network_mode: host
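For context on why this can help: with network_mode: host the container shares the host's network stack, so localhost:11434 inside the container reaches an Ollama server running on the host. Note that this has historically only worked on Linux; Docker Desktop on macOS and Windows runs containers inside a VM, which may explain why this fix does not work for everyone. A hedged fuller sketch (the OLLAMA_ENDPOINT variable is an assumption based on the project's .env, not confirmed by the snippet above):

```yaml
services:
  browser-use-webui:
    build: .
    network_mode: host            # share the host's network stack (Linux only)
    environment:
      - OLLAMA_ENDPOINT=http://localhost:11434  # now resolves to the host's Ollama
```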

@paulpenney

What fixed it for me is using host network mode:

services:
  browser-use-webui:
    # ...
    network_mode: host

I tried this and it didn't fix it for me; in fact, I can't even get it to boot up properly. It seems to get stuck in some sort of error loop.

@TahaW863

TahaW863 commented Jan 30, 2025

Just set the base URL to http://host.docker.internal:11434 for the Docker install,
or to http://localhost:11434 for a local install.

Make sure Ollama is refreshed after each server restart.

@chen8160

Just set the base URL to http://host.docker.internal:11434 for the Docker install, or to http://localhost:11434 for a local install.

Make sure Ollama is refreshed after each server restart.

Having the same issue; I tried this, but no luck.

@TahaW863

Just set the base URL to http://host.docker.internal:11434 for the Docker install, or to http://localhost:11434 for a local install.
Make sure Ollama is refreshed after each server restart.

Having the same issue; I tried this, but no luck.

Then make sure to increase the browser-use version in the requirements file from 1.29.0 to 1.30.0.

It's not perfect; sometimes the agent runs and sometimes it doesn't.
Also, in the browser options, disable recording and enable use of the browser (the left-most option).
If that doesn't work, halve the number of steps in the agent settings.

@chris-amaya

Here's what I tried, and still no luck:

  • adding host network mode to docker-compose.yml
  • adding the base URL to the configuration
  • increasing the browser-use version
  • disabling recording
  • decreasing the number of steps to very few

Since this seems to be a network issue, maybe modifying something in the firewall might work?

@ishrek

ishrek commented Jan 31, 2025

I'm running on a 2018 Intel Mac on macOS Sonoma.
I run the agent with Ollama.

But when I run the agent, I receive [Errno 61] Connection refused.

How do I resolve this issue?
@warmshao
Thanks!

@fxgardes

Same thing here ...

@vvincent1234
Contributor

I'm running on a 2018 Intel Mac on macOS Sonoma. I run the agent with Ollama.

But when I run the agent, I receive [Errno 61] Connection refused.

How do I resolve this issue? @warmshao Thanks!

Please ensure that the Ollama URL can actually be reached. If you run the webui in Docker, http://localhost:11434/ is not the correct base URL, because localhost inside the container refers to the container itself, not to the host running Ollama.
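That reachability check can be sketched in a few lines; /api/tags is Ollama's model-listing endpoint, and the candidate base URLs below are just the usual suspects for a Docker setup, not project-specific values:

```python
import urllib.error
import urllib.request

def ollama_ok(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama-style server answers at base_url."""
    url = base_url.rstrip("/") + "/api/tags"  # Ollama's model-listing endpoint
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# From inside the Docker container, localhost points at the container itself,
# so the Docker Desktop alias for the host is usually the one that matters:
for base in ("http://localhost:11434", "http://host.docker.internal:11434"):
    print(base, "->", ollama_ok(base))
```

Whichever base URL returns True here is the one to put in the webui's LLM settings.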

@iulko

iulko commented Feb 1, 2025

Ollama can be accessed from other Docker AI apps like Open WebUI, but not from browser web-ui using the same URL, so there is some issue.
Edit: referring to Errno 111.

@ishrek

ishrek commented Feb 8, 2025

I'm running on a 2018 Intel Mac on macOS Sonoma. I run the agent with Ollama.
But when I run the agent, I receive [Errno 61] Connection refused.
How do I resolve this issue? @warmshao Thanks!

Please ensure that the Ollama URL can actually be reached. If you run the webui in Docker, http://localhost:11434/ is not the correct base URL.

@vvincent1234 I'm running directly from the terminal, without Docker. I just cloned the source code and ran it. I have tried many times, but I get "Connection refused" every time.

@th1nkd0g

th1nkd0g commented Feb 13, 2025

I had the same Errno 111 while running browser-use/webui via Docker on an M4 Mac.

Changing OLLAMA_ENDPOINT in the .env file as below worked for me:

  # .env
  #OLLAMA_ENDPOINT=http://localhost:11434
  OLLAMA_ENDPOINT=http://host.docker.internal:11434  # changed to the Docker-internal host name

Additionally, I also changed the docker-compose.yml file, because it somehow didn't detect Apple silicon while building the Docker image, so I forced the arm64 platform configuration:

  # docker-compose.yml
  browser-use-webui:
    platform: linux/arm64  # changed to arm64 from amd64
    build:
      context: .
      dockerfile: ${DOCKERFILE:-Dockerfile}
      args:
        TARGETPLATFORM: ${TARGETPLATFORM:-linux/arm64}  # changed to arm64 from amd64

@ATTO-RATHORE

ATTO-RATHORE commented Feb 15, 2025

As I understand it, the problem is that a localhost URL will not work because the container is looking at itself. So the solution is to run Ollama as a Docker container:

  docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

then run the model:

  docker exec -it ollama ollama run qwen2.5:7b

and then use the base URL http://yourhostname:11434. Now I don't have the issue, but it is very slow with a local model: the task "go to google.com and type 'OpenAI', click search" took around 15 minutes to complete with the local deepseek and qwen models. Better to use Google's gemini-2.0-flash-exp model with the free API, which can complete it in about 1 minute.
