Feature request: Inject chat history before LLM calls #116
Comments

**mertyg:** Thank you Shaojie! This is indeed useful to have and a good start; we'd love to merge this once it's in shape! Something I'm not too sure about: do you really need the history injection? This is effectively what you are doing under the hood, but I think you could just extend the forward pass of LLM Call to pass not only the variable but also the history as a list.

**Shaojie Jiang:** Hi @mertyg, thanks for your swift response! Good point! My short answer is that history injection is easier to implement. Long answer: history through …
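The suggestion above is to extend the forward pass so that the LLM call receives the prior turns alongside the variable, rather than injecting them separately. A minimal sketch of that idea, assuming an OpenAI-style message format; `llm_forward` and its signature are illustrative assumptions, not TextGrad's actual API:

```python
from typing import Dict, List, Optional

Message = Dict[str, str]


def llm_forward(
    system_prompt: str,
    user_input: str,
    history: Optional[List[Message]] = None,
) -> List[Message]:
    """Assemble an OpenAI-style message list, passing prior turns
    explicitly as structured messages instead of flattening them
    into the prompt text."""
    messages: List[Message] = [{"role": "system", "content": system_prompt}]
    messages.extend(history or [])  # prior turns, in order
    messages.append({"role": "user", "content": user_input})
    return messages
```

The resulting list could then be handed to the engine as-is, so the optimized variable (the system prompt) and the conversation history stay in the format each model expects.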
Hi TextGrad developers,
First of all, thanks a lot for this great work!
I wonder whether you have any plans to allow specifying chat history before LLM calls? Below is some context explaining why this is important to me.
My task
Optimise the system prompt so that the chatbot behaves in a more controlled way as an interviewer. The objective is therefore not only to get a single good answer but also to get good responses after several turns of conversation.
Why can't I put the chat history in the prompt?
Most LLMs would understand it well if I put the chat history in the prompt. However, to rule out any misalignment with inference-time behaviour, it's better to pass the history in each LLM's idiosyncratic format, e.g., a list of dicts for OpenAI models.
MWE of my solution
Below is my solution. Although it works, it looks a bit hacky, so I'm calling it "history injection". Looking forward to your comments on a more proper implementation.
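The original MWE snippet is not preserved in this copy. As a minimal sketch of what "history injection" could look like, assuming a generic chat-engine interface (the `EchoEngine` and `HistoryInjectingEngine` names and the `chat()` method are hypothetical, not TextGrad's API): a wrapper stores the history once and silently prepends it, after the system message, on every call.

```python
from typing import Dict, List

Message = Dict[str, str]


class EchoEngine:
    """Stand-in for a chat engine; a real one would call an LLM API."""

    def chat(self, messages: List[Message]) -> str:
        # Echo the contents it received so injection is observable.
        return " | ".join(m["content"] for m in messages)


class HistoryInjectingEngine:
    """Wrap an engine and inject stored history before every call.

    This mirrors the 'hacky' flavour of history injection: the caller
    sets `injected_history` out of band, and each chat() call splices
    it in between the system message and the current turn.
    """

    def __init__(self, engine: EchoEngine):
        self.engine = engine
        self.injected_history: List[Message] = []

    def chat(self, messages: List[Message]) -> str:
        system, rest = messages[:1], messages[1:]
        return self.engine.chat(system + self.injected_history + rest)
```

The hacky part is exactly what the maintainers' comment points at: the history travels through hidden state on the wrapper rather than through the call's arguments.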
Best regards,
Shaojie Jiang