
Preprompt Missing in Rendered Chat Prompt #1445

Open
calycekr opened this issue Sep 3, 2024 · 1 comment


calycekr commented Sep 3, 2024

Bug description

The preprompt is missing in the rendered chat prompt when a user inputs a message. The expected behavior is for the preprompt to appear before the user's message, but it does not.

Steps to reproduce

  1. Set the chatPromptTemplate as follows:
    <s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{@root.preprompt}}\n{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}

  2. Set the preprompt to "You are an AI assistant".

  3. User inputs "Hello, world!".

  4. Render the prompt using the following code (src/lib/buildPrompt.ts):

    let prompt = model.chatPromptRender({
        messages: filteredMessages,
        preprompt,
        tools,
        toolResults,
    });
  5. The rendered prompt is <s>[INST] Hello, world! [/INST].

  6. Observe the output. The preprompt "You are an AI assistant" is missing.
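The steps above can be sketched in a self-contained way. This is a minimal stand-in for the Handlebars rendering done in src/lib/buildPrompt.ts (the helper below is hypothetical, not chat-ui's actual code); it shows the expected behavior: when the preprompt is passed through, it is prepended to the first user message, mirroring the {{#if @first}}{{@root.preprompt}}\n{{/if}} part of the template.

```typescript
interface Message {
  from: "user" | "assistant";
  content: string;
}

// Hypothetical simplified renderer mirroring the chatPromptTemplate logic.
function renderChatPrompt(messages: Message[], preprompt?: string): string {
  let out = "<s>";
  messages.forEach((m, i) => {
    if (m.from === "user") {
      // On the first message, inject the preprompt (mirrors {{#if @first}}).
      const prefix = i === 0 && preprompt ? `${preprompt}\n` : "";
      out += `[INST] ${prefix}${m.content} [/INST]`;
    } else {
      out += `${m.content}</s>`;
    }
  });
  return out;
}

const rendered = renderChatPrompt(
  [{ from: "user", content: "Hello, world!" }],
  "You are an AI assistant"
);
// Expected: "<s>[INST] You are an AI assistant\nHello, world! [/INST]"
```

The bug reported here is that the actual output is `<s>[INST] Hello, world! [/INST]`, i.e. as if `preprompt` were empty or undefined at render time.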

Specs

Config

    "preprompt": "You are an AI assistant",
    "chatPromptTemplate": "<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{@root.preprompt}}\n{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}",

Notes

The preprompt should be included in the rendered chat prompt, but it is not appearing. This issue needs to be addressed to ensure the correct behavior of the chat application.

@calycekr calycekr added the bug Something isn't working label Sep 3, 2024
@nsarrazin nsarrazin self-assigned this Sep 9, 2024
@pocman
Copy link

pocman commented Sep 12, 2024

I noticed the same issue.
When using the OpenAI endpoint type with the Gemini endpoint, the system message content is always empty except for assistants.

The preprompt specified in the assistant UI is correctly appended to the initial system content:

{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":"You are an assistant"},{"role":"user","content":[{"type":"text","text":"hello, is this working ?"}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}

The preprompt specified in the .env config is not reflected when using the "raw" model:

{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":""},{"role":"user","content":[{"type":"text","text":"hello, is this working ?"}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}

Note that Gemini's OpenAI-compatible API does not accept empty system content, so I currently can't use this endpoint:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
  https://europe-west4-aiplatform.googleapis.com/v1beta1/projects/${PROJECT_ID}/locations/europe-west4/endpoints/openapi/chat/completions \
  -d '{"model":"google/gemini-1.5-pro-001","messages":[{"role":"system","content":""},{"role":"user","content":[{"type":"text","text":"hello, is this working "}]}],"stream":true,"max_tokens":4096,"temperature":null,"top_p":null,"frequency_penalty":null}'

[{
  "error": {
    "code": 400,
    "message": "Expected a string 'content' field of a(n) 'system' message to be non-empty.",
    "status": "INVALID_ARGUMENT"
  }
}
]
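Until the preprompt propagation is fixed, one possible workaround is to drop empty system messages before calling an OpenAI-compatible endpoint that rejects them, as Gemini does with the INVALID_ARGUMENT error above. This is a hedged sketch of such a filter (a hypothetical helper, not part of chat-ui):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: unknown;
}

// Hypothetical helper: remove system messages whose content is empty or
// missing, since some OpenAI-compatible backends (e.g. Gemini) reject them.
function dropEmptySystemMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.filter(
    (m) => !(m.role === "system" && (m.content === "" || m.content == null))
  );
}

const cleaned = dropEmptySystemMessages([
  { role: "system", content: "" },
  { role: "user", content: [{ type: "text", text: "hello, is this working ?" }] },
]);
// cleaned now contains only the user message
```

This only masks the symptom; the underlying issue of the .env preprompt not reaching the rendered prompt would still need to be fixed.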
