Replies: 1 comment
There are two aspects to this answer.

1. How to make previous interactions accessible in the future. For this, you'll want to use a checkpointer (https://langchain-ai.github.io/langgraph/how-tos/persistence/). This keeps the state the graph finishes with accessible the next time the graph starts.
2. Okay, so you have this state, but how do you use it? The default is to pass the full list of messages to the LLM. This is fine, but may not be great if you have a lot of large messages, so you will probably want to manipulate the message list in some way, e.g. add a summary of the older conversation history (https://langchain-ai.github.io/langgraph/how-tos/memory/add-summary-conversation-history/).
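To make point 2 concrete, here is a minimal, hypothetical sketch of the "summarize older turns" idea. In a real LangGraph app the summary would be produced by an LLM call inside a graph node and stored in the graph state; here the summary is stubbed and the token count is a rough ~4-characters-per-token heuristic, just to show the shape of the strategy:

```python
# Hypothetical sketch: keep the most recent turns verbatim and fold
# older turns into a single summary message. Assumptions: messages are
# (role, content) tuples oldest-first, and ~4 chars ≈ 1 token.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)

def compress_history(messages, budget_tokens=1000, keep_recent=4):
    """Return a message list that fits the budget: recent turns verbatim,
    older turns collapsed into one summary message."""
    recent = messages[-keep_recent:]
    older = messages[:-keep_recent]

    used = sum(estimate_tokens(content) for _, content in recent)
    if not older or used >= budget_tokens:
        # Nothing older to summarize, or no budget left for a summary.
        return recent

    # Stub: a real implementation would prompt an LLM with `older`
    # (that is what the add-summary-conversation-history how-to does).
    summary = "Summary of %d earlier messages." % len(older)
    return [("system", summary)] + recent

history = [("user", "message %d" % i) for i in range(10)]
compact = compress_history(history, budget_tokens=1000, keep_recent=4)
```

After this runs, `compact` has five entries: one synthetic summary message followed by the four most recent turns, which is the same shape the LangGraph summarization pattern feeds to the model.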
-
Hi guys, is anyone familiar with the concept of LLM memory in LangGraph? I made a fairly complex graph for a RAG + web-search chatbot, but I can't figure out how I should implement memory, given that I'm only somewhat familiar with this.
I'm also unsure how to manage that context so it doesn't exceed the LLM's context window.
The graph architecture of the chatbot is under /utils/chatbot_graph.py
https://github.com/pablocpz/RAI-ai-personalized-voice-news-reporter/tree/main