
Problem with using Ollama/codellama as local LLM engine in Ubuntu 24.04 #3196

Closed
2 tasks done
HenrikBach1 opened this issue Jul 31, 2024 · 10 comments
Labels: bug (Something isn't working), severity:medium (Affecting multiple users), Stale (Inactive for 30 days)

Comments

@HenrikBach1

Is there an existing issue for the same bug?

Describe the bug

Hi

I'm trying to get OpenDevin to communicate with codellama:7b.

I'm using this Linux bash command:

# --pull=always
docker run -it --rm \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID=$(id -u) \
    \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --name opendevin-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/opendevin/opendevin:0.8.2

with the environment variable WORKSPACE_BASE pointing to a local test project.
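The repeated `[Errno 111] Connection refused` errors in the log below mean nothing answers at the address the container dials. The URL translation involved can be sketched as follows (a minimal illustration with a hypothetical `base_url_for_container` helper, not OpenDevin code; it assumes Docker's default bridge network and Ollama's default port 11434):

```shell
# Hypothetical helper: rewrite a host-local Ollama URL into one reachable
# from inside a container started with
#   --add-host host.docker.internal:host-gateway
# (on Linux, host.docker.internal does not resolve without that flag).
base_url_for_container() {
    printf '%s\n' "$1" | sed \
        -e 's#://localhost#://host.docker.internal#' \
        -e 's#://127\.0\.0\.1#://host.docker.internal#'
}

base_url_for_container "http://localhost:11434"
# → http://host.docker.internal:11434
```

Even with the right URL, the host side must listen on an interface containers can reach: on Ubuntu the Ollama systemd service binds only to 127.0.0.1 by default, so connections to `host.docker.internal:11434` are refused. Ollama's FAQ documents setting `Environment="OLLAMA_HOST=0.0.0.0"` via `systemctl edit ollama.service`, then `systemctl daemon-reload && systemctl restart ollama`.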

OpenDevin starts and appears ready to take user input. But when I enter something like "hi", after a long wait OpenDevin replies with:

There was an unexpected error while running the agent

and logs many error messages to its console:

sad@sad-HP-ZBook-17-G2:~/projects/cvechecker/opendevin-ollama$ ./opendevin.sh 
Starting OpenDevin...
Setting up enduser with id 1000
Docker socket group id: 1002
Creating group with id 1002
Running as enduser
/app/.venv/lib/python3.12/site-packages/llama_cloud/types/metadata_filter.py:20: SyntaxWarning: invalid escape sequence '\*'
  """
16:39:38 - opendevin:INFO: config.py:437 - Config file not found: [Errno 2] No such file or directory: 'config.toml'
INFO:     Started server process [42]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO:     ('172.17.0.1', 41350) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI1NjMwYzQ4OS1kYTg3LTQ5ZjktOTViOC1mMjRiNmZmNjFkNjQifQ.u1ypy37uZahppvcVFRKPE-qvRV63nU2G44UI9_ySTZc&latest_event_id=12" [accepted]
16:39:43 - opendevin:ERROR: auth.py:27 - Invalid token
INFO:     connection open
INFO:     connection closed
INFO:     ('172.17.0.1', 41362) - "WebSocket /ws" [accepted]
INFO:     connection open
16:39:49 - opendevin:INFO: agent.py:79 - Using runtime: server
16:39:49 - opendevin:INFO: ssh_box.py:135 - SSHBox is running as opendevin user with USER_ID=1000 in the sandbox
16:39:49 - opendevin:INFO: ssh_box.py:178 - Detected initial session.
16:39:49 - opendevin:INFO: ssh_box.py:180 - Creating new Docker container
16:39:49 - opendevin:WARNING: ssh_box.py:578 - Using port forwarding till the enable host network mode of Docker is out of experimental mode.Check the 897th issue on https://github.com/OpenDevin/OpenDevin/issues/ for more information.
16:39:49 - opendevin:INFO: ssh_box.py:586 - Mounting volumes: {'/home/sad/projects/cvechecker/opendevin-ollama/cvechecker-github-private': {'bind': '/workspace', 'mode': 'rw'}, '/tmp/cache': {'bind': '/home/opendevin/.cache', 'mode': 'rw'}}
16:39:49 - opendevin:INFO: ssh_box.py:597 - Container started
16:39:50 - opendevin:INFO: ssh_box.py:613 - waiting for container to start: 1, container status: running
16:39:52 - opendevin:INFO: ssh_box.py:342 - Connecting to SSH session...
16:39:52 - opendevin:INFO: ssh_box.py:345 - You can debug the SSH connection by running: `ssh -v -p 53375 opendevin@localhost` using the password 'f2b43572-2a03-40c1-914e-ed1c12429c13'
16:39:53 - opendevin:INFO: ssh_box.py:349 - Connected to SSH session
16:39:56 - opendevin:INFO: agent.py:100 - Creating agent CodeActAgent using LLM ollama/codellama
16:39:56 - opendevin:INFO: mixin.py:42 - Initializing plugins in the sandbox
16:39:56 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:39:56 - opendevin:INFO: mixin.py:59 - Copied files from [/app/opendevin/runtime/plugins/agent_skills] to [/opendevin/plugins/agent_skills] inside sandbox.
16:39:56 - opendevin:INFO: mixin.py:67 - Initializing plugin [agent_skills] by executing [/opendevin/plugins/agent_skills/setup.sh] in the sandbox.
16:40:09 - opendevin:INFO: mixin.py:85 - Plugin agent_skills initialized successfully
16:40:09 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:40:09 - opendevin:INFO: mixin.py:59 - Copied files from [/app/opendevin/runtime/plugins/jupyter] to [/opendevin/plugins/jupyter] inside sandbox.
16:40:09 - opendevin:INFO: mixin.py:67 - Initializing plugin [jupyter] by executing [/opendevin/plugins/jupyter/setup.sh] in the sandbox.
16:40:11 - opendevin:INFO: mixin.py:85 - Plugin jupyter initialized successfully
16:40:12 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:40:12 - opendevin:INFO: browser_env.py:78 - Starting browser env...
16:40:12 - opendevin:ERROR: state.py:127 - Failed to restore state from session: sessions/dcde6a63-256a-44e8-9d53-801e88d2fa93/agent_state.pkl
Error restoring state sessions/dcde6a63-256a-44e8-9d53-801e88d2fa93/agent_state.pkl
16:40:12 - opendevin:INFO: agent_controller.py:140 - [Agent Controller dcde6a63-256a-44e8-9d53-801e88d2fa93] Starting step loop...
INFO:     172.17.0.1:49460 - "GET /api/list-files?path=%2F HTTP/1.1" 200 OK
16:40:19 - opendevin:INFO: browser_env.py:121 - Browser env started.
INFO:     172.17.0.1:41232 - "GET / HTTP/1.1" 200 OK
16:40:27 - opendevin:INFO: browser_env.py:128 - SHUTDOWN recv, shutting down browser env...
16:40:27 - opendevin:INFO: session.py:66 - WebSocket disconnected, sid: dcde6a63-256a-44e8-9d53-801e88d2fa93
16:40:27 - opendevin:INFO: agent_controller.py:145 - AgentController task was cancelled
INFO:     connection closed
INFO:     172.17.0.1:41232 - "GET /assets/index-CKCRqlAe.js HTTP/1.1" 200 OK
16:40:28 - opendevin:ERROR: listen.py:336 - Error getting OLLAMA models: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624295910>: Failed to establish a new connection: [Errno 111] Connection refused'))
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 495, in _make_request
    conn.request(
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 398, in request
    self.endheaders()
  File "/usr/local/lib/python3.12/http/client.py", line 1331, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.12/http/client.py", line 1091, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.12/http/client.py", line 1035, in send
    self.connect()
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 236, in connect
    self.sock = self._new_conn()
                ^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x788624295910>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624295910>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/server/listen.py", line 329, in get_litellm_models
    ollama_models_list = requests.get(ollama_url, timeout=3).json()[
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624295910>: Failed to establish a new connection: [Errno 111] Connection refused'))
INFO:     172.17.0.1:41240 - "GET /api/options/models HTTP/1.1" 200 OK
INFO:     172.17.0.1:41240 - "GET /api/options/agents HTTP/1.1" 200 OK
INFO:     ('172.17.0.1', 41254) - "WebSocket /ws" [accepted]
INFO:     connection open
16:40:30 - opendevin:INFO: agent.py:79 - Using runtime: server
16:40:30 - opendevin:INFO: ssh_box.py:135 - SSHBox is running as opendevin user with USER_ID=1000 in the sandbox
16:40:30 - opendevin:INFO: ssh_box.py:178 - Detected initial session.
16:40:30 - opendevin:INFO: ssh_box.py:180 - Creating new Docker container
16:40:30 - opendevin:WARNING: ssh_box.py:578 - Using port forwarding till the enable host network mode of Docker is out of experimental mode.Check the 897th issue on https://github.com/OpenDevin/OpenDevin/issues/ for more information.
16:40:30 - opendevin:INFO: ssh_box.py:586 - Mounting volumes: {'/home/sad/projects/cvechecker/opendevin-ollama/cvechecker-github-private': {'bind': '/workspace', 'mode': 'rw'}, '/tmp/cache': {'bind': '/home/opendevin/.cache', 'mode': 'rw'}}
16:40:31 - opendevin:INFO: ssh_box.py:597 - Container started
16:40:32 - opendevin:INFO: ssh_box.py:613 - waiting for container to start: 1, container status: running
16:40:33 - opendevin:INFO: ssh_box.py:342 - Connecting to SSH session...
16:40:33 - opendevin:INFO: ssh_box.py:345 - You can debug the SSH connection by running: `ssh -v -p 53913 opendevin@localhost` using the password '6d80fb6f-b55f-46ee-a712-31589c552425'
16:40:34 - opendevin:INFO: ssh_box.py:349 - Connected to SSH session
16:40:37 - opendevin:INFO: agent.py:100 - Creating agent CodeActAgent using LLM ollama/codellama
16:40:37 - opendevin:INFO: mixin.py:42 - Initializing plugins in the sandbox
16:40:37 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:40:37 - opendevin:INFO: mixin.py:59 - Copied files from [/app/opendevin/runtime/plugins/agent_skills] to [/opendevin/plugins/agent_skills] inside sandbox.
16:40:37 - opendevin:INFO: mixin.py:67 - Initializing plugin [agent_skills] by executing [/opendevin/plugins/agent_skills/setup.sh] in the sandbox.
16:40:49 - opendevin:INFO: mixin.py:85 - Plugin agent_skills initialized successfully
16:40:50 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:40:50 - opendevin:INFO: mixin.py:59 - Copied files from [/app/opendevin/runtime/plugins/jupyter] to [/opendevin/plugins/jupyter] inside sandbox.
16:40:50 - opendevin:INFO: mixin.py:67 - Initializing plugin [jupyter] by executing [/opendevin/plugins/jupyter/setup.sh] in the sandbox.
16:40:52 - opendevin:INFO: mixin.py:85 - Plugin jupyter initialized successfully
16:40:52 - opendevin:INFO: mixin.py:30 - Sourced /opendevin/bash.bashrc and ~/.bashrc successfully
16:40:52 - opendevin:INFO: browser_env.py:78 - Starting browser env...
16:40:52 - opendevin:ERROR: state.py:127 - Failed to restore state from session: sessions/c5f8ccb3-0c67-4397-ab8c-b5281f5f926f/agent_state.pkl
Error restoring state sessions/c5f8ccb3-0c67-4397-ab8c-b5281f5f926f/agent_state.pkl
16:40:52 - opendevin:INFO: agent_controller.py:140 - [Agent Controller c5f8ccb3-0c67-4397-ab8c-b5281f5f926f] Starting step loop...
INFO:     172.17.0.1:49254 - "GET /api/list-files?path=%2F HTTP/1.1" 200 OK
16:41:00 - opendevin:INFO: browser_env.py:121 - Browser env started.
16:41:10 - USER_ACTION
**MessageAction** (source=EventSource.USER)
CONTENT: hi
INFO:     172.17.0.1:46270 - "GET /api/list-files?path=%2F HTTP/1.1" 200 OK


==============
CodeActAgent LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0


Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:10 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788676271c70>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #1 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:13 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624307020>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #2 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:15 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624296db0>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #3 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:17 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624306e70>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #4 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:26 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886243091c0>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #5 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:46 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x78862430a240>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #6 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:48 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624307b00>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #7 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:41:59 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x788624309430>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #8 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:42:15 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886243084a0>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #9 | You can customize these settings in the configuration.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

16:44:38 - opendevin:ERROR: llm.py:114 - litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused')). Attempt #10 | You can customize these settings in the configuration.
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 495, in _make_request
    conn.request(
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 398, in request
    self.endheaders()
  File "/usr/local/lib/python3.12/http/client.py", line 1331, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.12/http/client.py", line 1091, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.12/http/client.py", line 1035, in send
    self.connect()
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 236, in connect
    self.sock = self._new_conn()
                ^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2460, in completion
    generator = ollama.get_ollama_response(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/ollama.py", line 264, in get_ollama_response
    response = requests.post(
               ^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 143, in _start_step_loop
    await self._step()
  File "/app/opendevin/controller/agent_controller.py", line 392, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/codeact_agent/codeact_agent.py", line 194, in step
    response = self.llm.completion(
               ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 418, in exc_check
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 185, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 154, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1028, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 908, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2750, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8072, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7745, in exception_type
    raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))
16:44:38 - opendevin:ERROR: agent_controller.py:149 - Error while running the agent: litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))
16:44:38 - opendevin:ERROR: agent_controller.py:150 - Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 196, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 495, in _make_request
    conn.request(
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 398, in request
    self.endheaders()
  File "/usr/local/lib/python3.12/http/client.py", line 1331, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/lib/python3.12/http/client.py", line 1091, in _send_output
    self.send(msg)
  File "/usr/local/lib/python3.12/http/client.py", line 1035, in send
    self.connect()
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 236, in connect
    self.sock = self._new_conn()
                ^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 211, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 667, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2460, in completion
    generator = ollama.get_ollama_response(
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/ollama.py", line 264, in get_ollama_response
    response = requests.post(
               ^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 115, in post
    return request("post", url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 700, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 143, in _start_step_loop
    await self._step()
  File "/app/opendevin/controller/agent_controller.py", line 392, in _step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/codeact_agent/codeact_agent.py", line 194, in step
    response = self.llm.completion(
               ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
    return copy(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
    result = action(retry_state)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 418, in exc_check
    raise retry_exc.reraise()
          ^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 185, in reraise
    raise self.last_attempt.result()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 154, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1028, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 908, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2750, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8072, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7745, in exception_type
    raise ServiceUnavailableError(
litellm.exceptions.ServiceUnavailableError: litellm.ServiceUnavailableError: OllamaException: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/generate (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7886253066f0>: Failed to establish a new connection: [Errno 111] Connection refused'))

16:44:38 - OBSERVATION
ErrorObservation(content='There was an unexpected error while running the agent', observation='error')
INFO:     172.17.0.1:58784 - "GET /api/list-files?path=%2F HTTP/1.1" 200 OK

Yes, the local Ollama LLM is running, and I'm able to talk to it through its prompt.

Can someone help me figure out what I'm doing wrong?

/Henrik
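The "Connection refused" at the bottom of the traceback means nothing accepted the TCP connection on host.docker.internal:11434. A quick way to narrow this down is a plain TCP probe; this is a minimal sketch (the helper is hypothetical, and the host/port values mirror the configuration above):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run on the host: should be True if Ollama is up.
print(port_open("127.0.0.1", 11434))

# Run inside the OpenDevin container: False here, while the host probe
# is True, usually means Ollama is bound to 127.0.0.1 only and is not
# reachable via the Docker bridge address.
print(port_open("host.docker.internal", 11434))
```

If the host probe succeeds but the in-container probe fails, the problem is the listener's bind address, not OpenDevin's configuration.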

Current OpenDevin version

0.8.2

Installation and Configuration

# --pull=always \
docker run -it --rm \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID=$(id -u) \
    \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --name opendevin-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/opendevin/opendevin:0.8.2

Model and Agent

  • Model: ollama/codellama
  • Agent: CodeActAgent

Operating System

Linux Ubuntu 24.04

Reproduction Steps

  1. Install and run codellama on a Linux Ubuntu 24.04 host
  2. Start OpenDevin with the above configuration (taken from your documentation)
  3. Enter "hi" at the OpenDevin prompt
  4. Observe the error message
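Between steps 1 and 2, it's worth confirming the model is actually being served. Ollama's `/api/tags` endpoint lists the installed models as JSON; here is a hedged sketch of a check against that shape (the helper name is an assumption, and the payload layout follows Ollama's documented API):

```python
def model_available(tags_json: dict, wanted: str) -> bool:
    """True if any installed model's name starts with `wanted` (e.g. 'codellama')."""
    return any(m["name"].startswith(wanted) for m in tags_json.get("models", []))

# Against a live server this would be fed from Ollama's tags endpoint:
#   import json, urllib.request
#   tags = json.load(urllib.request.urlopen("http://localhost:11434/api/tags"))
# Sample payload in the shape /api/tags returns:
sample = {"models": [{"name": "codellama:7b"}, {"name": "llama3.1:latest"}]}
print(model_available(sample, "codellama"))  # True
print(model_available(sample, "mistral"))    # False
```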

Logs, Errors, Screenshots, and Additional Context

No response

@HenrikBach1 HenrikBach1 added the bug Something isn't working label Jul 31, 2024
@dosubot dosubot bot added the installation Related to installation and setup label Jul 31, 2024
@mamoodi mamoodi removed the installation Related to installation and setup label Jul 31, 2024
@mamoodi
Copy link
Collaborator

mamoodi commented Jul 31, 2024

I'll see if I can get someone to take a look at this.

@HenrikBach1
Copy link
Author

As I replied in the kevin-support-bot thread:

"Probably, it may be a duplicate of OpenDevin#2844.

However, I'm not (yet) interested in setting up a local development environment to get things running."

@HenrikBach1 HenrikBach1 changed the title Problem with using Ollama/codellama as local LLM engine Problem with using Ollama/codellama as local LLM engine in Ubuntu 24.04 Aug 1, 2024
@MichaelKarpe
Copy link

Same issue trying the following:

sudo docker run -it \
    --pull=always \
    -e SANDBOX_USER_ID=$(id -u) \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    -e LLM_MODEL="ollama/llama3.1" \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    --name opendevin-app-$(date +%Y%m%d%H%M%S) \
    ghcr.io/opendevin/opendevin:0.8.2

@molander
Copy link

molander commented Aug 5, 2024

Easy as pie. I'm a FreeBSD guy, so my Docker-fu is very weak and I don't really use Ollama, but with LocalAI it is a cinch. Normally I run the build version, but I tried this on an Ubuntu 22 LTS, a fresh Mint, and just fired it up on Debian:

Grab LocalAI via the bash installer if you really want easy:

curl https://localai.io/install.sh | sh

You can specify options while grabbing, but if you don't, the script will try to do the right thing (i.e., if you have Docker and the CUDA container runtime toolkit, it'll go that route).

Fair warning: the image is pretty large, because it has a lot pre-baked in (TTS, STT, text-to-image generation, etc.). As far as I know, LocalAI was the first and only open-source project to maintain feature parity with the OpenAI API nearly from the beginning; while others are bolting features on and starting to catch up, LocalAI is maturing on all fronts and even has a simple, elegant web UI for browsing models, quick prompt/inference testing, and more.

Or you could download the binary from GitHub. Or, if you are a very high-level Warlock, you could attempt to build from source, but be warned: the many submodules can make it tricky!

At any rate, once the script is finished, it will tell you LocalAI is up and running on *:8080, if you didn't change it.

Now, what I usually do is have LiteLLM proxy running:

 litellm --model openai/gpt-3.5-turbo --host 0.0.0.0 --port 11434 --api_base 'http://10.10.10.10:8080/v1' --detailed_debug --alias synthia34b-awq

That way, I can swap LLMs out underneath whatever apps on whichever boxen.
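The reason this swapping works is that the proxy speaks the OpenAI chat-completions wire format, so apps pointed at it don't care which backend model sits underneath. A minimal sketch of the request an app would send to the proxy above (the helper is hypothetical; the URL and model alias are taken from the command line shown):

```python
import json
from urllib.request import Request, urlopen

def chat_request(base_url: str, model: str, prompt: str) -> Request:
    """Build an OpenAI-style /v1/chat/completions request for a LiteLLM proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Authorization": "Bearer ollama"},
    )

req = chat_request("http://0.0.0.0:11434", "synthia34b-awq", "hi")
print(req.full_url)
# Sending would be urlopen(req); the proxy forwards to whatever backend it wraps.
```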

But you could do an ssh -L proxy if you need to work around docker host internal networking issues:

 ssh -L 0.0.0.0:11434:192.168.50.65:8080 localhost -N     

Here's an OpenDevin startup that will have you right as rain:

#!/usr/bin/env bash

export OPENAI_API_BASE=http://192.168.50.41:8080/v1
export WORKSPACE_BASE=/home/matto/workspace

docker run \
    -it \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID="opendevin" \
    -e LLM_API_KEY="stfu" \
    -e LLM_MODEL="openai/gpt-3.5-turbo" \
    -e LLM_BASE_URL="http://172.18.0.1:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:latest
  
# Plus whatever else you crazy Docker kids do!
    ##--pull=always 
    #-e SANDBOX_USER_ID=$(id -u) 

FYI: I'm sure I am doing it wrong, but when wrong works right, I'm ok with it.
Just tried OpenDevin out this last weekend and wow, what a fantastic project! Thanks for open-sourcing and keep up the killer work.

I'll see about getting some of my MemGPT prompts going in here and post if I find anything interesting :D

@HenrikBach1
Copy link
Author

@molander

Thank you for pointing me in the direction of using LiteLLM as a proxy between external/local LLMs and OpenDevin. It did the trick getting codellama and OpenDevin talking together.

/Henrik

@HenrikBach1
Copy link
Author

HenrikBach1 commented Aug 8, 2024

And then, after some research: not quite.

I wonder why the above environment variables aren't reflected in OpenDevin's Settings; my observation is that OpenDevin relies on the values from the Settings menu, regardless of the environment variables given above.

@mamoodi
Copy link
Collaborator

mamoodi commented Aug 8, 2024

There is some discussion on the ultimate configs OpenDevin uses: #3220

I believe the settings in the UI take highest priority, then whatever is in the configs. So yes, some of the settings in the UI will override the configs...

@mamoodi mamoodi added the severity:medium Affecting multiple users label Aug 14, 2024
@frankgestrada
Copy link

frankgestrada commented Aug 20, 2024

I had the same issue with OpenDevin not communicating with my local Llama LLM. I had to start ollama serve with the following:

OLLAMA_HOST=0.0.0.0:11434 OLLAMA_ORIGINS=* ollama serve

which allows communication with Docker containers. Then run the rest as usual:

docker run \
    -it \
    --pull=always \
    --network host \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID=$(id -u) \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e LLM_OLLAMA_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main
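The errno 111 in the original traceback is the kernel refusing the TCP connection, which is exactly what happens when Ollama listens only on 127.0.0.1 and a container dials in via the Docker bridge address. The refusal itself is easy to reproduce locally with a self-contained sketch (ports are picked by the OS here, not Ollama's):

```python
import socket

# A listener bound only to the loopback interface -- Ollama's default
# behaviour before OLLAMA_HOST=0.0.0.0 is set.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))       # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

# From the same host, via loopback, the connection succeeds.
with socket.create_connection(("127.0.0.1", port), timeout=1):
    print("loopback reachable")

# Once no listener covers the address being dialed, the kernel answers
# with RST and the client sees ECONNREFUSED (errno 111 on Linux) -- the
# same error OpenDevin hit through host.docker.internal.
srv.close()
try:
    socket.create_connection(("127.0.0.1", port), timeout=1)
except ConnectionRefusedError as e:
    print("refused, errno", e.errno)
```

Binding to 0.0.0.0 makes the listener answer on all interfaces, including the bridge address the container uses, which is why the OLLAMA_HOST change above fixes it.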

Copy link
Contributor

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

@github-actions github-actions bot added the Stale Inactive for 30 days label Sep 20, 2024
Copy link
Contributor

This issue was closed because it has been stalled for over 30 days with no activity.

@github-actions github-actions bot closed this as not planned Won't fix, can't repro, duplicate, stale Sep 27, 2024