When AssistantAgent gets called consecutively without new user messages, getting "Expected last role User or Tool" error when using Mistral models #5044
Comments
I think it's because tool call results are not considered part of the agent's "inner messages".
Tool call results are also part of the inner messages: autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py Lines 383 to 384 in abbdbb2
Though the summary is returned directly as the response if reflect_on_tool_use is not set, as you mentioned. Could you try adding a new assistant message to its model context before:
autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py Lines 399 to 403 in abbdbb2
autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py Lines 429 to 432 in abbdbb2
Let's see if adding this additional assistant message can help.
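For illustration, a minimal sketch of what that suggestion could look like, assuming the ChatCompletionContext and AssistantMessage APIs from autogen_core v0.4; the function and variable names below are placeholders, not the actual patch:

from autogen_core.model_context import ChatCompletionContext
from autogen_core.models import AssistantMessage, FunctionExecutionResult


async def record_tool_call_summary(
    model_context: ChatCompletionContext,
    exec_results: list[FunctionExecutionResult],
    agent_name: str,
) -> None:
    # Join the tool results the same way the tool call summary is built, then
    # store that summary in the model context as an assistant message, so the
    # next model call ends on "assistant" rather than a dangling tool result.
    summary = "\n".join(str(result.content) for result in exec_results)
    await model_context.add_message(AssistantMessage(content=summary, source=agent_name))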
Related #2828
If the messages with role = tool are indeed part of the inner messages, they are part of the discussion and sent to the other agents. Correct? Then why the extra assistant message?
{
"tool_calls": [
{
"id": "iRvM4muS4",
"function": {
"arguments": "{\"todo_id\": 493871141, \"project_id\": 45010942, \"target_url\": \"https://gitlab.com/lx-industries/wally-the-wobot/tests/repl-tests/-/issues/25#note_2296444245\", \"target_type\": \"Issue\", \"target_id\": 25}",
"name": "get_todo_discussion_id"
},
"type": "function"
}
],
"role": "assistant"
},
{
"content": "e7764e059fad9a55ff30dbd4b2bf108b5205e486",
"role": "tool",
"tool_call_id": "iRvM4muS4"
},
{
"content": "[{\"name\": \"list_issue_notes\", \"arguments\": {\"project_id\": 45010942, \"issue_iid\": 25, \"discussion_id\": \"e7764e059fad9a55ff30dbd4b2bf108b5205e486\"}}]",
"role": "assistant"
}
@ekzhu my understanding is those messages will have
If I comment out this block, then my swarm no longer works.
The extra assistant message comes from reflection on tool use: autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py Lines 405 to 416 in abbdbb2
When you comment out the whole block, it won't work because the code doesn't yield a Response anymore.
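To make the distinction concrete, a hedged sketch (the model client, tool stub, and agent name below are invented for the example): with reflect_on_tool_use=True the agent runs an extra inference over the tool results and returns that reflection as the Response; with it set to False, the tool call summary is returned directly.

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def get_todo_discussion_id(todo_id: int, project_id: int) -> str:
    # Stub standing in for the GitLab tool shown in the transcript above.
    return "e7764e059fad9a55ff30dbd4b2bf108b5205e486"


model_client = OpenAIChatCompletionClient(model="gpt-4o")

agent = AssistantAgent(
    name="todo_helper",
    model_client=model_client,
    tools=[get_todo_discussion_id],
    reflect_on_tool_use=True,  # True: reflection message is the Response; False: tool call summary is the Response
)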
Yes, that's correct. I missed that. Is there a chat template somewhere that Mistral can accept? @jackgerrits do you think this is a case for model family and addressing the discrepancy between different model providers?
@ekzhu IDK. Do you have an example for OpenAI I could refer to?
I have found this: https://github.com/mistralai/cookbook/blob/main/concept-deep-dive/tokenization/chat_templates.md, though I have never used it myself.
@ekzhu how does that help? IMHO the problem is this message:
{
"content": "[{\"name\": \"list_issue_notes\", \"arguments\": {\"project_id\": 45010942, \"issue_iid\": 25, \"discussion_id\": \"e7764e059fad9a55ff30dbd4b2bf108b5205e486\"}}]",
"role": "assistant"
}
which is created by: autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py Lines 419 to 432 in abbdbb2
If tool call results are part of the inner messages and are sent to the other agents, what is the point of that extra assistant message?
Answering my own question. The problem is that even if
I just found this; I don't know exactly how it can help, but I thought it might be useful.
The inner messages are not sent to other agents. They are yielded for observability purposes only. Only the application (caller) and the group chat manager (as part of the Team) have access to these messages. Other agents can only see the Response.
Yes, I have already recognized this (I have edited the title and added this as an issue we would like to tackle). Though, to be clear, the tool call summary message is not added to the model context, and the Response is not sent to the chat completion API but to other agents. I understand that when the same agent is called consecutively, it will be called with an assistant message as the last message in the context sent to the LLM. This is going to happen regardless of whether a tool call was used or not. So this is going to cause errors for model APIs that do not support "assistant, assistant, ..." in the message context.
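A minimal sketch of that consecutive-call situation using the public on_messages API (the empty second call is an assumption about how a team can re-invoke the same agent without new user input):

from autogen_core import CancellationToken
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage


async def call_twice(agent: AssistantAgent) -> None:
    # First call: a normal user request; after it, the agent's own assistant
    # message sits last in its model context.
    await agent.on_messages(
        [TextMessage(content="What is my next todo?", source="user")],
        CancellationToken(),
    )
    # Second consecutive call with no new user message: the context sent to
    # the LLM now ends with an assistant message, which APIs such as
    # Mistral's reject.
    await agent.on_messages([], CancellationToken())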
I am wondering how the last message could have ended up like that. Can you post how the agents are set up?
What happened?
On some occasions, the chat completion is called with its last message's role set to "assistant".
Example:
The Mistral API does not support that:
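(The original request and error output are not reproduced above; the following is an illustrative reconstruction with assumed values.) The failing request ends with an assistant-role message, which Mistral rejects with the "Expected last role User or Tool" error quoted in the title:

# Illustrative only: not the reporter's original payload.
messages = [
    {"role": "user", "content": "What is my next todo?"},
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "iRvM4muS4",
                "type": "function",
                "function": {"name": "get_todo_discussion_id", "arguments": "{...}"},
            }
        ],
    },
    {"role": "tool", "tool_call_id": "iRvM4muS4", "content": "e7764e059fad9a55..."},
    {"role": "assistant", "content": "[{\"name\": \"list_issue_notes\", ...}]"},
]
# Mistral's chat completions endpoint rejects this history because the last
# message has role "assistant" rather than "user" or "tool".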
What did you expect to happen?
I expect no errors.
My understanding is that the next message is expected to be a handoff. But with Mistral's API, that's not possible.
How can we reproduce it (as minimally and precisely as possible)?
Run Mistral via LiteLLM:
How I run LiteLLM:
compose.yml
config.yml
Then, in my app:
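Since the compose.yml, config.yml, and application code did not survive in this copy, here is a rough sketch of the described setup (the endpoint URL, API key, model_info values, and the agent/handoff layout are assumptions, not the reporter's code):

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import Swarm
from autogen_ext.models.openai import OpenAIChatCompletionClient

# mistral-large-latest served through a local LiteLLM proxy (OpenAI-compatible endpoint).
model_client = OpenAIChatCompletionClient(
    model="mistral-large-latest",
    base_url="http://localhost:4000",  # assumed LiteLLM port
    api_key="sk-anything",             # placeholder / LiteLLM master key
    model_info={                       # required for non-OpenAI model names;
        "vision": False,               # exact field names may vary by autogen-ext version
        "function_calling": True,
        "json_output": True,
        "family": "unknown",
    },
)

planner = AssistantAgent(
    "planner",
    model_client=model_client,
    handoffs=["worker"],
    system_message="Plan the work, then hand off to the worker.",
)
worker = AssistantAgent(
    "worker",
    model_client=model_client,
    handoffs=["planner"],
    system_message="Do the work, then hand back to the planner.",
)

team = Swarm([planner, worker], termination_condition=MaxMessageTermination(10))


async def main() -> None:
    result = await team.run(task="List the notes on my next GitLab todo.")
    print(result.stop_reason)


asyncio.run(main())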
AutoGen version
0.4.1
Which package was this bug in
AgentChat
Model used
mistral-large-latest
Python version
3.12
Operating system
Ubuntu 24.04
Any additional info you think would be helpful for fixing this bug
No response