Describe the issue
To switch GraphRAG's LLM and embedding modules to local models, I deployed an Ollama service and modified the GraphRAG configuration file following the instructions at https://github.com/TheAiSingularity/graphrag-local-ollama. The pipeline then fails at the `create_final_entities` step.
After reading the source code, I found that GraphRAG relies on OpenAI's Python API, which does not support Ollama embeddings.
Do you have any plans to support Ollama as a provider for both the LLM and embeddings in the future?
Thank you.
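For context, a minimal sketch of what the graphrag-local-ollama setup aims for in the embeddings section of `settings.yaml`: pointing GraphRAG's OpenAI-style embedding client at a local Ollama server. The model name, port, and placeholder key are assumptions to adjust for your deployment, not values from this issue.

```yaml
# Hypothetical settings.yaml fragment: route embeddings to a local
# Ollama server through an OpenAI-compatible endpoint.
embeddings:
  llm:
    type: openai_embedding            # GraphRAG still speaks the OpenAI API
    api_key: ollama                   # placeholder; Ollama ignores the key
    api_base: http://localhost:11434/v1   # assumed local Ollama address
    model: nomic-embed-text           # any embedding model pulled into Ollama
```

Whether this works depends on the Ollama version exposing an OpenAI-compatible embeddings endpoint, which is exactly the gap this issue reports.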
Steps to reproduce
No response
GraphRAG Config Used
No response
Logs and screenshots
No response
Additional Information
GraphRAG Version:
Operating System:
Python Version:
Related Issues:
sysullf added the triage label (default label assignment, indicating a new issue needs review by a maintainer) on Jul 11, 2024.
sysullf changed the title to "[Issue]: <title> create_final_entities error when i use ollama embedding model" on Jul 11, 2024.
Hi! We are centralizing other LLM discussions in these threads:
Other LLM/API bases: #339
Ollama: #345
Local embeddings: #370
We'd appreciate it if you could bring your question to those threads for community support.
I'll resolve this issue so we can keep the focus on those threads.
AlonsoGuevara changed the title to "[Issue]: create_final_entities error when i use ollama embedding model" on Jul 11, 2024.