400 bad request after LLM response #1460
johan456789
started this conversation in Help
I seem to have gotten the frontend and Vercel deployment running. It responds to my query, but after the chat completes, the request results in a 400 error.
Also, the chat history doesn't seem to be persisted. Here's what it shows after leaving and coming back to the chat: an empty page.
What might be the cause of this error? Has anyone successfully set up their own hosted version without issues?