How to set the graphRAG with local ollama #212

Closed
edenbuaa opened this issue Sep 4, 2024 · 7 comments
Labels: bug (Something isn't working)



edenbuaa commented Sep 4, 2024

Description

How to set the graphRAG with local ollama?

Reproduction steps

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

Screenshots

![DESCRIPTION](LINK.png)

Logs

No response

Browsers

No response

OS

No response

Additional information

No response

edenbuaa added the bug label on Sep 4, 2024
mkhludnev (Contributor) commented Sep 5, 2024

I briefly looked through the code and noticed an explicit reference to OpenAI, e.g.

text_embedder = OpenAIEmbedding(

so it might not be ready for local LLMs.
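Ollama serves an OpenAI-compatible API, so one possible workaround is to point an OpenAI-style embedding client at the local endpoint instead of api.openai.com. A minimal sketch using the stock openai client, assuming a recent Ollama that exposes the /v1/embeddings route and a pulled nomic-embed-text model (both assumptions; kotaemon's own OpenAIEmbedding wrapper may use different parameter names):

```python
from openai import OpenAI

# Point the client at Ollama's OpenAI-compatible endpoint instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:11434/v1",  # local Ollama server (assumed default port)
    api_key="ollama",                      # Ollama ignores the key, but the client requires one
)

# Embed with a local model; "nomic-embed-text" is an assumption -- any pulled
# embedding model works the same way.
resp = client.embeddings.create(model="nomic-embed-text", input=["hello world"])
print(len(resp.data[0].embedding))         # 768 dimensions for nomic-embed-text
```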

mkhludnev (Contributor) commented
Another issue is that GraphRAG is a separate program with its own configuration, which could be pointed at another LLM, but that is a separate question: microsoft/graphrag#657
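For reference, GraphRAG's own settings.yaml has separate llm and embeddings blocks, and both can be redirected to a local server. A rough sketch of the relevant keys, assuming Ollama's OpenAI-compatible endpoint and locally pulled models (model names and port are assumptions, not something this thread confirms):

```yaml
# settings.yaml (excerpt) -- redirect both chat and embeddings to local Ollama.
llm:
  api_key: ollama                       # graphrag expects a value; Ollama ignores it
  type: openai_chat
  model: llama3                         # any chat model pulled into Ollama
  api_base: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint

embeddings:
  llm:
    api_key: ollama
    type: openai_embedding
    model: nomic-embed-text             # an Ollama embedding model
    api_base: http://localhost:11434/v1
```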

mkhludnev (Contributor) commented

> I briefly looked through the code and noticed an explicit reference to OpenAI, e.g. `text_embedder = OpenAIEmbedding(`, so it might not be ready for local LLMs.

I'm stuck here:

Entity count: 318
2024-09-09 19:15:38.962 - openai._base_client/DEBUG - Sending HTTP Request: POST https://api.openai.com/v1/embeddings

Error embedding chunk {'OpenAIEmbedding': "Error code: 

I think it should pick the default configured embeddings instead.
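One reason the request still goes to api.openai.com: an OpenAI client constructed without an explicit base_url falls back to the library default unless OPENAI_BASE_URL is set in the environment. Exporting that variable is one way to redirect a hard-coded OpenAIEmbedding without code changes; this is a workaround sketch, not something the thread confirms kotaemon supports:

```python
import os
from openai import OpenAI

# Redirect any default-configured OpenAI client to the local Ollama server.
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
os.environ["OPENAI_API_KEY"] = "ollama"   # placeholder; a key must be present

client = OpenAI()        # picks up OPENAI_BASE_URL from the environment
print(client.base_url)   # now points at the local Ollama endpoint, not api.openai.com
```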

edenbuaa (Author) commented

@mkhludnev thanks. I can build GraphRAG and run the indexing pipeline successfully with my local Ollama after investigating the GraphRAG components.

But I got an error pointing to a mismatch between the OpenAI and Ollama embeddings during the query stage. This error is hidden in the kotaemon pipeline, so it seems like a lot of work to fix.
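The query-stage failure is typical of mixing embedders: vectors indexed with one model cannot be scored against query vectors from a model with a different dimensionality. A small illustration with made-up vectors (the dimensions are typical for OpenAI ada-002 vs. Ollama's nomic-embed-text, not taken from the thread):

```python
import numpy as np

index_vec = np.random.rand(1536)   # indexed with an OpenAI model (1536 dims)
query_vec = np.random.rand(768)    # queried with an Ollama model (768 dims)

try:
    score = float(index_vec @ query_vec)   # dot-product similarity
except ValueError as err:
    # shapes (1536,) and (768,) don't align, so the query stage fails
    print("embedding mismatch:", err)
```

The fix is to use the same embedding model (and hence the same api_base) for both indexing and querying.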


kksasa commented Sep 10, 2024

If someone can run Ollama for GraphRAG, please share the steps with us :)

edenbuaa (Author) commented

Finally, it works. I will make a PR to this repo.

taprosoft (Collaborator) commented

Please check the latest version and the instructions for setting up GraphRAG here: https://github.com/Cinnamon/kotaemon#setup-graphrag
