
My Convex requests to the Ollama API keep getting forbidden #251

Open
jiajiahard opened this issue Sep 16, 2024 · 9 comments

Comments
@jiajiahard

[GIN] 2024/09/16 - 13:05:29 | 403 | 39.122µs | 127.0.0.1 | POST "/api/embeddings"
9/16/2024, 12:54:28 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Sending data for embedding: {"model":"mxbai-embed-large","prompt":"Alice is talking to Bob"}'
9/16/2024, 12:54:29 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught SyntaxError: Unexpected end of JSON input
at parse [as parse] ()
at json [as json] (../../udf-runtime/src/23_response.ts:217:2)
at async (../../convex/util/llm.ts:658:5)
at async retryWithBackoff (../../convex/util/llm.ts:250:22)
at async ollamaFetchEmbedding (../../convex/util/llm.ts:645:21)
at async (../../convex/util/llm.ts:153:6)
at async Promise.all (index 0)
at async all [as all] ()
at async fetchEmbeddingBatch (../../convex/util/llm.ts:152:18)
at async fetchBatch (../../convex/agent/embeddingsCache.ts:29:17)
at async fetch (../../convex/agent/embeddingsCache.ts:10:16)
I have already set up forwarding with Tunnelmole, but the requests are still being forbidden.
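The "Unexpected end of JSON input" in the stack trace above suggests Convex received a non-JSON (likely empty) body alongside the 403. A quick way to confirm is to hit the Ollama embeddings endpoint directly with the same payload the log shows and inspect the raw status and body. This is a diagnostic sketch, not a fix; it assumes Ollama is on the default localhost:11434.

```shell
# Hypothetical sanity check: send the exact payload from the Convex log
# straight to Ollama and look at the raw HTTP response. A 403 with an
# empty body would explain the JSON parse error on the Convex side.
payload='{"model":"mxbai-embed-large","prompt":"Alice is talking to Bob"}'

curl -i --max-time 5 http://localhost:11434/api/embeddings \
  -H "Content-Type: application/json" \
  -d "$payload" \
  || echo "request failed (is Ollama running on 11434?)"
```

If this curl succeeds from the same shell where Convex runs but the project's requests still 403, the difference is likely in the request's headers (for example, the Origin header) rather than the network path.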

@jiajiahard
Author

Even before I used Tunnelmole forwarding, access was always forbidden. I am running this in a Windows virtual machine.

@ianmacartney
Collaborator

ianmacartney commented Sep 16, 2024 via email

@jiajiahard
Author

Sending requests to that port from my own terminal works fine and I receive responses. But once the project is running, every request the project sends gets forbidden, while requests from other terminals still go through normally.

@jiajiahard
Author

I also found another problem: following your setup guide, after I used socat to bridge the connection between Convex and Ollama, I could no longer start Ollama, because of a process conflict on port 11434.

@ianmacartney
Collaborator

If you're already running ollama on 11434 then you don't need to start a second process. I'm not sure what's conflicting with 11434.
Maybe the requests are failing because it's running in a container where it can't find the port. You can try putting console logs around the request to test.

@jiajiahard
Author

Run the following command to bridge the port so Convex and Ollama can communicate:

socat TCP-LISTEN:11434,fork TCP:$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):11434 &

This is one of the steps in your guide. If I run this command, I can no longer start Ollama, because both processes would occupy port 11434.
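The `$(awk ... /etc/resolv.conf)` part of that command resolves to the Windows host's gateway IP as seen from WSL2, so the relay only makes sense when Ollama runs on the Windows host and WSL needs to reach it as `localhost:11434`. If Ollama runs inside the same environment, both it and socat would try to bind 11434, which matches the conflict described above. A hedged diagnostic sketch (the `HOST_IP` extraction mirrors the guide's command; the 127.0.0.1 fallback is my addition):

```shell
# Check what currently holds port 11434 before starting anything.
(ss -tlnp 2>/dev/null | grep 11434) || echo "port 11434 is free (or ss unavailable)"

# Windows-host gateway IP as seen from WSL2 -- same trick as the guide's
# socat command, just captured into a variable first.
HOST_IP=$(awk '/nameserver/ {print $2}' /etc/resolv.conf 2>/dev/null | head -n1)
HOST_IP=${HOST_IP:-127.0.0.1}  # fallback so the script stays runnable

echo "would relay localhost:11434 -> ${HOST_IP}:11434"
# Only run this if Ollama is on the Windows host, NOT in this environment:
# socat TCP-LISTEN:11434,fork TCP:${HOST_IP}:11434 &
```

In other words: run either Ollama or the socat relay on 11434, never both in the same place.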

@ianmacartney
Collaborator

Apologies - I haven't done those steps, as I'm using a Mac. I wonder if you could start Ollama first before executing that command. Otherwise there may be other Windows users who can assist in the Discord. I'm not sure why socat is necessary tbh

@quanchentg

@jiajiahard Did you ever solve this?

@ianmacartney
In my case, the ollama service (running locally) doesn't even receive the POST request from convex.

and the log shows: 2024/10/17 12:32:03 [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden

All my projects and services are running locally.
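A 403 from Ollama even though everything runs locally is commonly caused by Ollama's Origin check: it rejects requests whose Origin header it does not allow, which matches the "forbidden" errors throughout this thread. The usual workaround is the `OLLAMA_ORIGINS` environment variable. A sketch, assuming Ollama is started manually from a shell; if it runs as a system service, the variables need to be set in the service's environment instead:

```shell
# Relax Ollama's Origin check so requests from Convex's runtime are not
# rejected with 403. '*' allows any Origin header.
export OLLAMA_ORIGINS='*'

# Optional: listen on all interfaces instead of just loopback, useful
# when the request arrives via a tunnel or from a VM.
export OLLAMA_HOST=0.0.0.0

echo "OLLAMA_ORIGINS=${OLLAMA_ORIGINS}"
# ollama serve
```

After restarting Ollama with these variables set, the earlier curl test should return a 200 with a JSON body instead of a 403.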

@leol15

leol15 commented Oct 27, 2024

@quanchentg I think your issue might be the same as #249 (comment)
