
macOS Ollama: failed to create asking task + timeout #571

Open
emrecengdev opened this issue Jul 31, 2024 · 1 comment
Labels
bug Something isn't working module/ai-service ai-service related

Comments

@emrecengdev
Describe the bug
I am following this tutorial:
https://blog.getwren.ai/how-to-use-meta-llama-3-to-query-mysql-database-using-ollama-on-your-machine-2c087b204e41
I get this error when I enter a prompt and ask a question:

[Screenshot: CleanShot 2024-07-31 at 17 16 07]

To Reproduce
Steps to reproduce the behavior:

  1. Follow the blog steps to install. Note that the blog is missing the part about Docker Desktop (not everyone has it installed by default). My data model loaded fine
  2. Click on Ask
  3. See the error

Expected behavior
I expect Wren to answer my question.

Screenshots
Ollama is running successfully:
[Screenshot: CleanShot 2024-07-31 at 17 19 42]

Desktop (please complete the following information):

  • OS: macOS Sonoma
  • Browser: Arc (Chromium-based)

Wren AI Information

  • Version: 0.7.1
  • LLM_PROVIDER= ollama_llm
  • GENERATION_MODEL= llama3

Additional context
Unlike other users, I noticed that after the UI service logged its startup info, the other services (AI service, etc.) never came up. After waiting a long time, I got a timeout error.

[Screenshot: CleanShot 2024-07-31 at 17 13 55]

Logs and my env.ai are attached. I think the issue is similar to #511.

wrenai-ibis-server.log
wrenai-wren-ai-service.log
wrenai-wren-engine.log
wrenai-wren-ui.log

Here is the .env.ai file:
env.ai.txt

Here is the .env.dev file:
env.dev.txt

@emrecengdev emrecengdev added the bug Something isn't working label Jul 31, 2024
@cyyeh
Member

cyyeh commented Jul 31, 2024

@emrecengdev thanks for reaching out! Sorry, I couldn't download the env.ai.txt file; it returned a 404. Also, from the Wren AI service log, it seems you were using the OpenAILLM provider instead of the Ollama provider. Please check the value of LLM_PROVIDER in your .env.ai file. Thank you!
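For reference, a minimal sketch of what the relevant lines of .env.ai should look like, based only on the values the reporter quoted in this issue (other keys that a full .env.ai may require are omitted here):

```shell
# Hedged sketch based on the values quoted in this issue.
# If the AI service log shows OpenAILLM being used, LLM_PROVIDER
# is likely set to something other than ollama_llm.
LLM_PROVIDER=ollama_llm
GENERATION_MODEL=llama3
```

After editing .env.ai, the containers generally need to be restarted for the new environment values to take effect.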

@cyyeh cyyeh added the module/ai-service ai-service related label Aug 8, 2024
2 participants