Error decoding response body: expected value at line 1 column 1 #99
This issue is stale because it has been open for 30 days with no activity.
Hello, I also get the same error. Did you find a solution for it?
Unfortunately not yet; I am still waiting for an answer or an update.
Hello, did you check the TGI logs? I assume the response body is not formatted correctly; there may be an issue with the way the response is parsed.
The TGI output is fine and can be consumed by both langchain and chat-ui. TGI with codellama-34 can also be consumed fine with a Python requests call.
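For context on what this error class means: "expected value at line 1 column 1" is the generic message a JSON parser (serde_json in llm-ls, in this case) emits when the response body does not start with JSON at all, e.g. when a proxy or wrong URL returns an HTML error page. A minimal sketch, using Python's `json` module (whose analogous message is "Expecting value"), with an assumed TGI-style response shape:

```python
import json

# A TGI-style /generate response body parses cleanly.
# (Field name "generated_text" assumed from TGI's documented response shape.)
valid_body = '{"generated_text": "def hello():\\n    print(\\"hi\\")"}'
print(json.loads(valid_body)["generated_text"][:9])  # def hello

# A misrouted request often returns an HTML error page instead of JSON.
# Any JSON parser then fails on the very first byte, which is exactly
# the "expected value at line 1 column 1" that serde_json reports.
html_body = "<html><body>404 Not Found</body></html>"
try:
    json.loads(html_body)
except json.JSONDecodeError as e:
    print(e.msg, e.lineno, e.colno)  # Expecting value 1 1
```

So the error coming from the extension does not necessarily contradict the observation that TGI itself works: the body the *extension* receives (possibly via a different path or proxy) may not be the body requests sees.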
I'm having the same(?) issue, or at least the error message popup I see in VSCode is very similar:
Is there a good way to debug this? Possibly related: https://users.rust-lang.org/t/serde-expected-value-at-line-1-column-1/16605
I am trying to use llm-vscode with a locally deployed Text Generation Inference (TGI) server but I keep getting the following error:
Error decoding response body: expected value at line 1 column 1
My setting is the following, where … and … correspond to my server path. I tried it both with /generate and without it.
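A way to narrow this down is to capture the raw bytes the configured endpoint actually returns before any JSON parsing happens. The sketch below is self-contained: a stub HTTP server stands in for a misconfigured proxy in front of TGI (the `/generate` path and `{"inputs": ...}` payload follow TGI's documented API, but the URL and failure mode here are assumptions for illustration, not the reporter's real setup):

```python
import json
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ProxyStub(BaseHTTPRequestHandler):
    """Simulates a proxy answering with HTML instead of JSON."""
    def do_POST(self):
        body = b"<html><body>502 Bad Gateway</body></html>"
        self.send_response(502)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ProxyStub)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/generate",
    data=json.dumps({"inputs": "def hello():"}).encode(),
    headers={"Content-Type": "application/json"},
)
try:
    urllib.request.urlopen(req)
except urllib.error.HTTPError as e:
    raw = e.read()     # inspect the raw body before parsing it
    print(raw[:6])     # b'<html>' -- clearly not JSON
server.shutdown()
```

If the raw body printed this way is valid JSON, the problem is on the extension side; if it is HTML or empty, the endpoint URL or proxy configuration is the likelier culprit.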