My Convex requests to the Ollama API keep being forbidden #251
Before I set up Tunnelmole forwarding, access was also always forbidden and unreachable. I'm using a Windows virtual machine.
I would try validating that you can make that request locally, or print the result of await request.text(): "Unexpected end of JSON input" usually means the response body was an error message rather than JSON.
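That suggestion can be sketched as a small helper (the function name is hypothetical, not part of ai-town's llm.ts): read the body as text first, so a non-JSON error response surfaces as a readable message instead of a parse failure.

```typescript
// Hypothetical helper illustrating the tip above: read the body as text
// before parsing, so a 403 error page shows up as a readable message
// instead of "Unexpected end of JSON input".
export async function parseJsonOrExplain(response: Response): Promise<unknown> {
  const text = await response.text();
  try {
    return JSON.parse(text);
  } catch {
    throw new Error(
      `Expected JSON but got status ${response.status} with body: ${text.slice(0, 200)}`
    );
  }
}
```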
I can send requests and receive responses normally on my own port, but once the project is running, every request the project sends gets forbidden, while requests from other terminals still go through fine.
I also found another problem: following your setup guide, after using socat to bridge communication between Convex and Ollama, I can't start Ollama because of a process conflict on port 11434.
If you're already running Ollama on 11434, then you don't need to start a second process. I'm not sure what's conflicting with 11434.
"Run the following command to bridge the port so Convex and Ollama can communicate:
socat TCP-LISTEN:11434,fork TCP:$(cat /etc/resolv.conf | grep nameserver | awk '{print $2}'):11434 &"
This is one of the steps in your guide. If I run this command, I can't start Ollama, because both would occupy port 11434.
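For what it's worth, that socat line is just a byte-for-byte TCP forwarder; a minimal Node equivalent (a sketch, not project code) makes the conflict visible: the bridge has to bind the listen port itself, so it cannot share 11434 with a locally running Ollama.

```typescript
import * as net from "net";

// Sketch of what `socat TCP-LISTEN:<port>,fork TCP:<host>:<port>` does:
// accept connections on listenPort and pipe the bytes to targetHost:targetPort.
// If Ollama is already bound to listenPort, listen() fails with EADDRINUSE,
// which is the port conflict described above.
export function startBridge(
  listenPort: number,
  targetHost: string,
  targetPort: number
): net.Server {
  const server = net.createServer((client) => {
    const upstream = net.connect(targetPort, targetHost);
    client.pipe(upstream);
    upstream.pipe(client);
    client.on("error", () => upstream.destroy());
    upstream.on("error", () => client.destroy());
  });
  server.listen(listenPort);
  return server;
}
```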
Apologies - I haven't done those steps myself, as I'm using a Mac. I wonder if you could start Ollama first before executing that command. Otherwise there may be other Windows users who can assist in the Discord. I'm not sure why it conflicts.
@jiajiahard Did you solve it? @ianmacartney and the log shows all my projects and services running locally.
@quanchentg I think your issue might be the same as #249 (comment) |
[GIN] 2024/09/16 - 13:05:29 | 403 | 39.122µs | 127.0.0.1 | POST "/api/embeddings"
9/16/2024, 12:54:28 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] [LOG] 'Sending data for embedding: {"model":"mxbai-embed-large","prompt":"Alice is talking to Bob"}'
9/16/2024, 12:54:29 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught SyntaxError: Unexpected end of JSON input
at parse [as parse] ()
at json [as json] (../../udf-runtime/src/23_response.ts:217:2)
at async (../../convex/util/llm.ts:658:5)
at async retryWithBackoff (../../convex/util/llm.ts:250:22)
at async ollamaFetchEmbedding (../../convex/util/llm.ts:645:21)
at async (../../convex/util/llm.ts:153:6)
at async Promise.all (index 0)
at all [as all] ()
at async fetchEmbeddingBatch (../../convex/util/llm.ts:152:18)
at async fetchBatch (../../convex/agent/embeddingsCache.ts:29:17)
at async fetch (../../convex/agent/embeddingsCache.ts:10:16)
Tunnelmole forwarding is already set up, but access keeps being forbidden.
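One way to narrow this down is to send the exact payload from the log to Ollama directly, outside Convex, and inspect the raw status and body. A sketch (the helper names and default URL are assumptions, not project code):

```typescript
// Build the same payload the log shows, then POST it straight to Ollama so the
// raw status/body can be inspected. A 403 here means Ollama itself rejects the
// request, independent of the project code. The base URL is the local default;
// adjust it if you're going through a tunnel.
export function embeddingPayload(model: string, prompt: string): string {
  return JSON.stringify({ model, prompt });
}

export async function probeEmbeddings(
  baseUrl: string = "http://127.0.0.1:11434"
): Promise<{ status: number; body: string }> {
  const res = await fetch(`${baseUrl}/api/embeddings`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: embeddingPayload("mxbai-embed-large", "Alice is talking to Bob"),
  });
  return { status: res.status, body: await res.text() };
}
```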