Add Metadata to LLM Calls in Logfire #740
Comments
Using spans allows you to do this, like this:
And LLMs in Logfire right now are traced per provider (OpenAI, Anthropic). I agree it would be great if tracing went deeper into LLM-specific filtering beyond what plain spans provide, but it is not clear how to do that without disrupting the OpenAI and Anthropic APIs. It would be great if it could be done generically for all LLM-type traces.
@antoni0z can you please clarify this?
Yes, I think that having special attributes for LLMs is needed. Right now, tracing is done separately for OpenAI traces and Anthropic traces. So if I want to build a dashboard for tokens consumed, it is difficult to do in a general way, because I have to filter the Anthropic way and the OpenAI way separately. If each LLM call recorded a common set of attributes, such as tokens spent, function calls used, and other LLM-specific attributes, this would enable better general LLM filtering. It would also help libraries like LangChain integrate better with Logfire if they followed these tracing abstractions. It's just an idea, so take it with a grain of salt. What do you think? Does Logfire have something like this in mind to become even more helpful in the LLM-specific context?
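The normalization step described above could be sketched as a small helper that maps each provider's usage fields onto one common attribute scheme. The `llm.*` attribute names are hypothetical, not an existing Logfire convention; the `usage` field shapes match the OpenAI and Anthropic response formats:

```python
def normalize_usage(provider: str, response: dict) -> dict:
    """Map provider-specific token-usage fields onto common attribute names."""
    usage = response["usage"]
    if provider == "openai":
        # OpenAI responses report usage.prompt_tokens / usage.completion_tokens.
        prompt, completion = usage["prompt_tokens"], usage["completion_tokens"]
    elif provider == "anthropic":
        # Anthropic responses report usage.input_tokens / usage.output_tokens.
        prompt, completion = usage["input_tokens"], usage["output_tokens"]
    else:
        raise ValueError(f"unknown provider: {provider}")
    return {
        "llm.provider": provider,
        "llm.prompt_tokens": prompt,
        "llm.completion_tokens": completion,
        "llm.total_tokens": prompt + completion,
    }
```

With attributes normalized like this, one dashboard query over `llm.total_tokens` would cover both providers instead of needing a per-provider filter.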
Description
Implement a feature in Logfire that allows users to attach custom metadata to each LLM API call. This enhancement would enable more detailed tracking, filtering, and analysis of LLM interactions within the Logfire observability platform, and a better understanding of token consumption.