[Question]: LLM on Ollama endpoints is not responding based on a document imported from the RAG API. #3920
Unanswered
SEOLJINYOUNG
asked this question in Troubleshooting
What is your question?
LLM on Ollama endpoints is not responding based on a document imported from the RAG API.
EMBEDDINGS_PROVIDER=ollama
EMBEDDINGS_MODEL=nomic-embed-text
MODEL=llama3.1:70b
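To rule out the embedding side, I can hit Ollama's embeddings endpoint directly and confirm that nomic-embed-text returns a vector (a minimal sketch, not LibreChat code; it assumes Ollama is listening on the default http://localhost:11434):

// Minimal sketch: verify nomic-embed-text returns an embedding.
// Assumes Ollama is reachable at the default http://localhost:11434.
const res = await fetch('http://localhost:11434/api/embeddings', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'nomic-embed-text',
    prompt: 'hello world', // any test string
  }),
});
const { embedding } = await res.json();
console.log('embedding length:', embedding?.length); // nomic-embed-text should return 768 dimensions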
I added logging in app\clients\OllamaClient.js, and the messages list sent to the model contains two messages:
messages = [
  {role: 'system', content: 'content from rag api'},
  {role: 'user', content: 'user input'},
]
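To check whether the model itself honors a document injected through the system message, the same two-message shape can be replayed against Ollama's chat endpoint directly (a sketch under the same localhost assumption; the options.num_ctx value is my guess at something worth checking, since a small context window can silently truncate a long RAG system prompt before the model sees it):

// Sketch: replay the two-message shape against Ollama's /api/chat.
const res = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3.1:70b',
    stream: false,
    // Raise the context window; Ollama's default (2048 tokens in many
    // builds) can truncate a long system prompt carrying document text.
    options: { num_ctx: 8192 },
    messages: [
      { role: 'system', content: 'The attached document says: the sky is green.' },
      { role: 'user', content: 'What does the attached document say?' },
    ],
  }),
});
const data = await res.json();
console.log(data.message?.content); // should repeat "the sky is green"

If this direct call answers from the injected system message but LibreChat does not, the problem is likely in how the RAG content reaches OllamaClient.js rather than in the model.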
More Details
When I ask the LLM to tell me the contents of the attached document, it answers that it does not save or remember the previous conversation, or it answers from its pre-trained knowledge rather than from the document.
I'd appreciate any ideas on which parts to look at.
What is the main subject of your question?
Endpoints, User System/OAuth, Other