[Bug]: missing role for choice 0 #1222
-
Contact Details

What happened?

Hi! Thanks for your project!

Steps to Reproduce

What browsers are you seeing the problem on?

Chrome

Relevant log output

chat-meilisearch | [2023-11-26T10:08:11Z INFO actix_web::middleware::logger] 172.19.0.4 "POST /indexes/messages/documents HTTP/1.1" 202 136 "-" "undici" 0.002903
chat-meilisearch | [2023-11-26T10:08:11Z INFO index_scheduler::batch] document addition done: DocumentAdditionResult { indexed_documents: 1, number_of_documents: 25 }
chat-meilisearch | [2023-11-26T10:08:11Z INFO index_scheduler] A batch of tasks was successfully completed.
LibreChat | {"level":"info","message":"[Login] [Login successful] [Username: <USER NAME>] [Request-IP: 10.234.129.178]","timestamp":"2023-11-26T10:09:49.292Z"}
chat-meilisearch | [2023-11-26T10:09:49Z INFO actix_web::middleware::logger] 172.19.0.4 "GET /health HTTP/1.1" 200 22 "-" "undici" 0.000338
LibreChat | ask log
LibreChat | {
LibreChat | text: 'How to build helicopter',
LibreChat | conversationId: null,
LibreChat | endpointOption: {
LibreChat | endpoint: 'openAI',
LibreChat | chatGptLabel: null,
LibreChat | promptPrefix: null,
LibreChat | modelOptions: {
LibreChat | model: 'gpt-3.5-turbo-0613',
LibreChat | temperature: 1,
LibreChat | top_p: 1,
LibreChat | presence_penalty: 0,
LibreChat | frequency_penalty: 0
LibreChat | }
LibreChat | }
LibreChat | }
LibreChat | maxContextTokens 4095
LibreChat | maxContextTokens 4095
LibreChat | Loading history for conversation 9f50946e-1361-454e-adee-89d2b1cf53bf 00000000-0000-0000-0000-000000000000
LibreChat | remainingContextTokens, this.maxContextTokens (1/2) 4084 4095
LibreChat | remainingContextTokens, this.maxContextTokens (2/2) 4084 4095
LibreChat | <-------------------------PAYLOAD/TOKEN COUNT MAP------------------------->
LibreChat | Payload: [ { role: 'user', content: 'How to build helicopter' } ]
LibreChat | Token Count Map: { '198fa90e-b440-45af-9e64-ae9e26e9a67d': 8 }
LibreChat | Prompt Tokens 11 remainingContextTokens 4084 this.maxContextTokens 4095
LibreChat | { '198fa90e-b440-45af-9e64-ae9e26e9a67d': 8, instructions: undefined }
LibreChat | userMessage.tokenCount 8
LibreChat | userMessage {
LibreChat | messageId: '198fa90e-b440-45af-9e64-ae9e26e9a67d',
LibreChat | parentMessageId: '00000000-0000-0000-0000-000000000000',
LibreChat | conversationId: '9f50946e-1361-454e-adee-89d2b1cf53bf',
LibreChat | sender: 'User',
LibreChat | text: 'How to build helicopter',
LibreChat | isCreatedByUser: true,
LibreChat | tokenCount: 8
LibreChat | }
chat-meilisearch | [2023-11-26T10:10:05Z INFO actix_web::middleware::logger] 172.19.0.4 "POST /indexes/messages/documents HTTP/1.1" 202 136 "-" "undici" 0.003237
chat-meilisearch | [2023-11-26T10:10:05Z INFO actix_web::middleware::logger] 172.19.0.4 "GET /indexes/convos/documents/9f50946e-1361-454e-adee-89d2b1cf53bf HTTP/1.1" 404 189 "-" "undici" 0.000423
chat-meilisearch | [2023-11-26T10:10:05Z INFO index_scheduler::batch] document addition done: DocumentAdditionResult { indexed_documents: 1, number_of_documents: 26 }
LibreChat | [Meilisearch] Convo not found and will index 9f50946e-1361-454e-adee-89d2b1cf53bf
LibreChat | baseURL https://openai-proxy.<PROXY URL>/public/v1
LibreChat | modelOptions {
LibreChat | model: 'gpt-3.5-turbo-0613',
LibreChat | temperature: 1,
LibreChat | top_p: 1,
LibreChat | presence_penalty: 0,
LibreChat | frequency_penalty: 0,
LibreChat | stop: [ '||>', '\nUser:', '<|diff_marker|>' ],
LibreChat | user: '65630dee349199df4bb0634f',
LibreChat | stream: true,
LibreChat | messages: [ { role: 'user', content: 'How to build helicopter' } ]
LibreChat | }
chat-meilisearch | [2023-11-26T10:10:05Z INFO index_scheduler] A batch of tasks was successfully completed.
chat-meilisearch | [2023-11-26T10:10:05Z INFO actix_web::middleware::logger] 172.19.0.4 "POST /indexes/convos/documents HTTP/1.1" 202 134 "-" "undici" 0.004077
chat-meilisearch | [2023-11-26T10:10:05Z INFO index_scheduler::batch] document addition done: DocumentAdditionResult { indexed_documents: 1, number_of_documents: 11 }
chat-meilisearch | [2023-11-26T10:10:05Z INFO index_scheduler] A batch of tasks was successfully completed.
LibreChat | [OpenAIClient.chatCompletion][stream] Unhandled error type
LibreChat | [OpenAIClient.chatCompletion][finalChatCompletion] Unhandled error type
LibreChat | [OpenAIClient.chatCompletion] Unhandled error type
LibreChat | Error: Error: missing role for choice 0
LibreChat | at OpenAIClient.chatCompletion (/app/api/app/clients/OpenAIClient.js:778:15)
LibreChat | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
LibreChat | at async OpenAIClient.sendCompletion (/app/api/app/clients/OpenAIClient.js:411:15)
LibreChat | at async OpenAIClient.sendMessage (/app/api/app/clients/BaseClient.js:451:24)
LibreChat | at async /app/api/server/routes/ask/openAI.js:97:20
LibreChat | Error: Error: missing role for choice 0
LibreChat | at OpenAIClient.chatCompletion (/app/api/app/clients/OpenAIClient.js:778:15)
LibreChat | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
LibreChat | at async OpenAIClient.sendCompletion (/app/api/app/clients/OpenAIClient.js:411:15)
LibreChat | at async OpenAIClient.sendMessage (/app/api/app/clients/BaseClient.js:451:24)
LibreChat | at async /app/api/server/routes/ask/openAI.js:97:20
LibreChat | TypeError: Cannot destructure property 'abortController' of 'abortControllers.get(...)' as it is undefined.
LibreChat | at abortMessage (/app/api/server/middleware/abortMiddleware.js:14:11)
LibreChat | at handleAbortError (/app/api/server/middleware/abortMiddleware.js:111:20)
LibreChat | at /app/api/server/routes/ask/openAI.js:141:5
LibreChat | at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
chat-meilisearch | [2023-11-26T10:10:16Z INFO actix_web::middleware::logger] 172.19.0.4 "POST /indexes/messages/documents HTTP/1.1" 202 136 "-" "undici" 0.002529
chat-meilisearch | [2023-11-26T10:10:16Z INFO index_scheduler::batch] document addition done: DocumentAdditionResult { indexed_documents: 1, number_of_documents: 27 }
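The trailing TypeError in the log is a follow-on failure: `abortMessage` destructures the result of `abortControllers.get(...)` without checking that an entry exists for the conversation. A minimal sketch of the failure mode and a guarded variant (illustrative names only, not LibreChat's actual middleware code):

```javascript
// Hypothetical registry mirroring the `abortControllers` map seen in the
// stack trace; entries are stored per conversation id.
const abortControllers = new Map();

// Unguarded, as in abortMiddleware.js:14 — destructuring `undefined` throws:
//   const { abortController } = abortControllers.get(conversationId);
//   TypeError: Cannot destructure property 'abortController' of ... as it is undefined.

// Guarded sketch: fall back to an empty object when no entry exists,
// so callers get `undefined` instead of a crash.
function getAbortController(conversationId) {
  const { abortController } = abortControllers.get(conversationId) ?? {};
  return abortController;
}

abortControllers.set('known-id', { abortController: new AbortController() });
console.log(typeof getAbortController('known-id')); // 'object'
console.log(getAbortController('missing-id'));      // undefined
```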
Also, an example of my proxy's output:
curl -X 'POST' \
'https://<PROXY URL>/public/v1/chat/completions' \
-H 'accept: application/json' \
-H 'Authorization: Bearer eyJ---<TOKEN>---g' \
-H 'Content-Type: application/json' \
-d '{
"model": "gpt-3.5-turbo",
"messages": [{"role": "user", "content": "what is color of rainbow?"}]
}'
{
"id": "chatcmpl-8P6OlBiw9ezDPMvAWwPvXLP4Lep2J",
"object": "chat.completion",
"created": 1700993487,
"model": "gpt-3.5-turbo-0613",
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": "The colors of a rainbow are red, orange, yellow, green, blue, indigo, and violet.",
"name": "",
"function_call": null,
"tool_calls": [],
"tool_call_id": ""
},
"finish_reason": "stop"
}
],
"usage": {
"prompt_tokens": 13,
"completion_tokens": 22,
"total_tokens": 35
},
"system_fingerprint": ""
}
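Note that the non-streaming response above does include "role": "assistant", but LibreChat's request sets stream: true, so the error suggests the proxy's streamed deltas may never carry a role field. A rough sketch of how a streaming client assembles a message from deltas and why a missing role fails at finalization (an assumption about the mechanism; this is not the actual openai SDK or LibreChat code):

```javascript
// Assemble a chat message from streamed chunks. In a conforming stream,
// the first delta for a choice carries `role`; later deltas carry content.
function assembleMessage(chunks) {
  const message = { role: undefined, content: '' };
  for (const chunk of chunks) {
    const delta = chunk.choices[0].delta;
    if (delta.role) message.role = delta.role;       // normally the 1st chunk
    if (delta.content) message.content += delta.content;
  }
  // A client that requires a role raises a similar error at finalization:
  if (!message.role) throw new Error('missing role for choice 0');
  return message;
}

// A conforming stream: the first delta carries the role.
const ok = assembleMessage([
  { choices: [{ index: 0, delta: { role: 'assistant' } }] },
  { choices: [{ index: 0, delta: { content: 'Red, orange, yellow...' } }] },
]);
console.log(ok.role); // 'assistant'
```

If the proxy's streamed chunks omit `delta.role` entirely, a check like the one above would fail exactly as in the log, so comparing the proxy's stream: true output against the upstream OpenAI API would be a useful next diagnostic step.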
Replies: 4 comments 8 replies
-
It looks like you’re on an older version; update to the latest and try again. Also make sure you’re using the /c/ route and not /chat/, as the latter is now deprecated.
-
Update instructions are here: https://docs.librechat.ai/install/docker_compose_install.html#updating-librechat
-
Also, are you using another project for the reverse-proxy setup? A link would be helpful so I can try it out for testing.
-
I ran into this today as well. I'm also behind a reverse proxy, using gpt-4 on my end today. The response generated for the most part, then randomly threw the error and cut out.
You're welcome! It should be resolved in the latest update; try it out and let me know!