add usage statistics for inference API #894

Open · wants to merge 1 commit into base: main

Conversation

dineshyv
Contributor

What does this PR do?

Adds a new usage field to inference APIs to indicate token counts for prompt and completion.
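
As a rough illustration only (not the actual PR diff), a usage field with an OpenAI-style shape could look like the sketch below; the class and field names are assumptions:

```python
# Hypothetical sketch -- the real models live in llama-stack's inference API.
from typing import Optional

from pydantic import BaseModel


class UsageInfo(BaseModel):
    """Token accounting attached to an inference response."""

    prompt_tokens: int                   # tokens consumed by the prompt
    completion_tokens: int               # tokens produced by the model
    total_tokens: Optional[int] = None   # convenience sum, if the provider reports it


class ChatCompletionResponse(BaseModel):
    completion_message: str              # simplified; the real response carries more
    usage: Optional[UsageInfo] = None    # the new field proposed by this PR
```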

@facebook-github-bot added the CLA Signed label on Jan 28, 2025
@dineshyv changed the title from "add usage statistics for chat completion non streaming" to "add usage statistics for inference API" on Jan 28, 2025
@vladimirivic
Contributor

@dineshyv I think you also want to regenerate the OpenAPI spec so this gets updated:
https://github.com/meta-llama/llama-stack/blob/main/docs/resources/llama-stack-spec.yaml#L1965-L1988

I think you just need to run docs/openapi_generator/run_openapi_generator.sh

@vladimirivic (Contributor) left a comment

LGTM, approving to unblock; see my comment about the spec update.

@ashwinb
Contributor

ashwinb commented Jan 29, 2025

@raghotham any comments on generality / future-facing stuff?

@dineshyv hold on for a couple of hours before merging.

@ashwinb (Contributor) left a comment

This needs more discussion. Here is an alternate idea:

  • We think of this as an extension of our telemetry API; or rather, telemetry needs to just work for this too.
  • From that standpoint, we can make sure these stats get logged as "metrics" into our telemetry API and can therefore be queried.
  • This of course works only if the distro has a telemetry provider.

However, this is not sufficient on its own, because it makes for a terrible dev-ex when debugging during development. When I want to understand usage, I want each API call's usage to be available inline as well. So how about augmenting all Llama Stack API calls with metrics? A rough sketch of that idea follows below.
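
A minimal sketch of that idea, assuming the hypothetical UsageInfo shape above and an illustrative in-memory telemetry sink (these names are not the actual llama-stack telemetry API):

```python
# Illustrative only: a stand-in telemetry sink plus a wrapper that logs
# per-call usage as metrics while still returning it inline on the response.
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List


@dataclass
class Metric:
    name: str
    value: float
    attributes: Dict[str, str]


@dataclass
class InMemoryTelemetry:
    """Stand-in for a telemetry provider that records metrics for later querying."""

    metrics: List[Metric] = field(default_factory=list)

    def log_metric(self, metric: Metric) -> None:
        self.metrics.append(metric)


def with_usage_metrics(
    telemetry: InMemoryTelemetry,
    api_name: str,
    call: Callable[..., Any],
    *args: Any,
    **kwargs: Any,
) -> Any:
    """Run an API call, forward its usage stats to telemetry, and return the response."""
    response = call(*args, **kwargs)
    usage = getattr(response, "usage", None)
    if usage is not None:
        telemetry.log_metric(Metric("prompt_tokens", usage.prompt_tokens, {"api": api_name}))
        telemetry.log_metric(Metric("completion_tokens", usage.completion_tokens, {"api": api_name}))
    return response
```

With something along these lines, usage shows up both inline on each response (good dev-ex while debugging) and as queryable metrics in whatever telemetry provider the distro has configured.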
