Bug description
The preprompt is missing from the rendered chat prompt when a user sends a message. The expected behavior is for the preprompt to appear before the user's message, but it does not.
Steps to reproduce
Set the chatPromptTemplate as follows: <s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{@root.preprompt}}\n{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}
Set the preprompt to "You are an AI assistant".
User inputs "Hello, world!".
Render the prompt via src/lib/buildPrompt.ts, with the model configured as follows:
"preprompt": "You are an AI assistant",
"chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{@root.preprompt}}\n{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}",
Observe the output. The rendered prompt is <s>[INST] Hello, world![/INST]; the preprompt "You are an AI assistant" is missing.
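For reference, a minimal sketch in plain TypeScript of what the template above should produce. This emulates the Handlebars semantics (`@first` injecting `@root.preprompt` before the first user message) rather than calling the actual buildPrompt.ts, so the function name and types here are illustrative only:

```typescript
// Emulation of the chatPromptTemplate above (not the real buildPrompt.ts):
// the preprompt should be injected before the first message only.
type Message = { from: "user" | "assistant"; content: string };

function renderPrompt(messages: Message[], preprompt: string): string {
  let out = "<s>";
  messages.forEach((m, i) => {
    if (m.from === "user") {
      // {{#if @first}}{{@root.preprompt}}\n{{/if}}
      out += `[INST] ${i === 0 ? preprompt + "\n" : ""}${m.content} [/INST]`;
    } else {
      out += `${m.content}</s>`;
    }
  });
  return out;
}

const rendered = renderPrompt(
  [{ from: "user", content: "Hello, world!" }],
  "You are an AI assistant"
);
// rendered === "<s>[INST] You are an AI assistant\nHello, world! [/INST]"
// The bug: chat-ui instead renders "<s>[INST] Hello, world![/INST]",
// i.e. the preprompt is silently dropped.
```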
Notes
The preprompt should be included in the rendered chat prompt, but it is not appearing. This issue needs to be addressed to ensure the correct behavior of the chat application.
I noticed the same issue.
When using the OpenAI endpoint type with Gemini, the system message content is always empty except for assistants.
A preprompt specified in the assistant UI is correctly appended to the initial system content:
{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":"You are an assistant"},{"role":"user","content":[{"type":"text","text":"hello, is this working ?"}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}
A preprompt specified in the .env is not reflected when using a "raw" model:
{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":""},{"role":"user","content":[{"type":"text","text":"hello, is this working ?"}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}
Note that Gemini's OpenAI-compatible API does not accept an empty system content field, so I currently can't use this endpoint at all:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
https://europe-west4-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/europe-west4/endpoints/openapi/chat/completions \
-d '{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":""},{"role":"user","content":[{"type":"text","text":"hello, is this working "}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}'
[{
"error": {
"code": 400,
"message": "Expected a string 'content' field of a(n) 'system' message to be non-empty.",
"status": "INVALID_ARGUMENT"
}
}
]
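Given that 400 response, one possible workaround until the preprompt bug is fixed is to strip empty system messages from the payload before the request is sent. This is a sketch, not part of chat-ui; the function name and message shape are hypothetical, modeled on the JSON payloads above:

```typescript
// Hypothetical pre-send filter: Gemini's OpenAI-compatible endpoint
// rejects system messages whose content is empty (400 INVALID_ARGUMENT),
// so drop them from the messages array before issuing the request.
type ChatMessage = { role: string; content: unknown };

function dropEmptySystemMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.filter(
    (m) => !(m.role === "system" && (m.content === "" || m.content == null))
  );
}

const payload: ChatMessage[] = [
  { role: "system", content: "" },
  { role: "user", content: [{ type: "text", text: "hello, is this working ?" }] },
];
const cleaned = dropEmptySystemMessages(payload);
// cleaned contains only the user message; a non-empty system message
// (e.g. one set from the assistant UI) would pass through untouched.
```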