
Anthropic stream flag first reasoning message as final with AWS Bedrock #762

Open
celeriev opened this issue Jan 24, 2025 · 3 comments
Labels: more info (More information required)

celeriev commented Jan 24, 2025

My agent is connected to Anthropic. If I call run(), the result is the agent's last message, produced after the various tool calls. However, using:

async with agent.run_stream(user_prompt=message.content) as result:
    async for token in result.stream():
        await msg.stream_token(token, is_sequence=True)

I only stream the first message produced by the agent, which basically explains that it is going to make a tool call. I expect to stream only the last message.

Do I need to configure this differently? Thanks
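
For reference, a minimal sketch of the non-streaming call being compared against (agent construction omitted; result.data is the pydantic_ai result accessor shown later in this thread):

# Non-streaming: run() drives the full tool-call loop and returns the final message
result = await agent.run(user_prompt=message.content)
print(result.data)  # expected: the agent's last message, produced after the tool calls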

sydney-runkle (Member)

@celeriev,

Thanks for the report. Could you please give a bit more context - maybe a snippet with your agent, etc?

sydney-runkle added the more info (More information required) label Jan 24, 2025
celeriev (Author)

Hello @sydney-runkle. Thanks for your help. I will work on building a small snippet. Generally, this is a simple agent with one tool that retrieves data from a DB. Instead of returning the summary, as I get with agent.run(), agent.run_stream outputs the result of the first LLM call, for example something like: "I will help you with this task. Let me fetch the data."

celeriev (Author) commented Jan 27, 2025

Hello @sydney-runkle, sorry for the initial issue, it was indeed missing enough information.
Here is a snippet of the code that is currently causing the issue:

from pydantic_ai import Agent
from pydantic_ai.models.anthropic import AnthropicModel
from anthropic import AsyncAnthropicBedrock
from random import randint
import asyncio

def get_random_number():
    """Return a random number between 0 and 100 as a string."""
    res = str(randint(0, 100))
    print("Tool Called with result: ", res)
    return res

agent = Agent(
    AnthropicModel(model_name="anthropic.claude-3-5-sonnet-20241022-v2:0", anthropic_client=AsyncAnthropicBedrock(aws_region="us-west-2")),
    system_prompt="Return a random number using your tools",
    retries=3,
    tools=[get_random_number],
)

async def main():
    # Streaming call: expected to stream the final answer, but only the
    # pre-tool "I'll help you..." message is streamed and flagged as final.
    async with agent.run_stream(user_prompt="Can you generate a random number?") as result:
        async for token in result.stream():
            print(token)

    print("Done with agent_stream")

    # Non-streaming call: runs the tool and returns the model's final message.
    sync_result = await agent.run(user_prompt="Can you generate a random number?")
    print(sync_result.data)

if __name__ == "__main__":
    asyncio.run(main())

It results in:

I'll
I'll help you generate a random number
I'll help you generate a random number using the `get
I'll help you generate a random number using the `get_random_number
I'll help you generate a random number using the `get_random_number` function.
I'll help you generate a random number using the `get_random_number` function.
Tool Called with result:  1
Done with agent_stream
Tool Called with result:  78
The random number generated is 78. This number was generated using the provided tool. Would you like me to generate another random number for you?

Also, the first way of calling the agent (run_stream) does not lead to a final call to Claude, unlike run, where the output results from the tool's output plus a follow-up Claude request.
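
In the meantime, a possible workaround (just a sketch, assuming losing token-by-token streaming is acceptable) is to fall back to the non-streaming run(), which does perform the follow-up Claude request after the tool returns:

# Workaround sketch: the non-streaming run() completes the tool round-trip
# and returns the model's final message instead of the pre-tool preamble.
sync_result = await agent.run(user_prompt="Can you generate a random number?")
print(sync_result.data)  # e.g. "The random number generated is 78. ..."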

celeriev changed the title from "Anthropic Streaming stream first message only" to "Anthropic stream only first reasoning message with AWS Bedrock" Jan 28, 2025
celeriev changed the title from "Anthropic stream only first reasoning message with AWS Bedrock" to "Anthropic stream flag first reasoning message as final with AWS Bedrock" Jan 29, 2025