Track model metadata #481

Closed
SmittieC opened this issue Jun 26, 2024 · 4 comments · Fixed by #810
Labels
enhancement New feature or request

Comments

@SmittieC
Collaborator

SmittieC commented Jun 26, 2024

It would really help users if OCS knew a few things about the selected models, so that we can build guardrails that ultimately lower frustration and make the platform more robust. See this thread as an example where it would have been useful if OCS had known what the model's token limit is.

Model metadata to track

  • Token limit
  • Rates / cost
  • Support for function calling

Currently we store model token limits on the Experiment or Pipeline Node, but the values are specific to the LLM model, so they should be stored along with the model name.

We must also allow users to create new 'models' via the UI, e.g. fine-tuned models.

The token limits should then be accessed through the LlmService objects.
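Something like the sketch below could capture this. It is illustrative only: the LlmProviderModel name, its fields, and the LlmService accessor are assumptions rather than the actual OCS implementation.

```python
# Illustrative sketch only: a per-model metadata record plus an LlmService
# accessor. Names and fields are assumptions, not the real OCS schema.
from django.db import models


class LlmProviderModel(models.Model):
    # e.g. "gpt-4o", or the identifier of a user-created fine-tuned model
    name = models.CharField(max_length=255)
    # Context window size for the model
    max_token_limit = models.PositiveIntegerField()
    # Cost tracking (per million tokens); optional since pricing isn't always known
    input_cost_per_million_tokens = models.DecimalField(
        max_digits=10, decimal_places=4, null=True, blank=True
    )
    output_cost_per_million_tokens = models.DecimalField(
        max_digits=10, decimal_places=4, null=True, blank=True
    )
    # Whether the model supports function/tool calling
    supports_function_calling = models.BooleanField(default=False)


class LlmService:
    """Wraps a provider client and exposes the selected model's metadata."""

    def __init__(self, provider_model: LlmProviderModel):
        self.provider_model = provider_model

    def get_token_limit(self) -> int:
        # Experiments and Pipeline Nodes would read the limit from here
        # instead of storing their own copies.
        return self.provider_model.max_token_limit
```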

@SmittieC SmittieC changed the title OCS to have knowledge of model limits OCS to have knowledge of model metadata Jun 28, 2024
@SmittieC
Collaborator Author

If we know the token limit for a specific model, we can

  • Count input tokens and disallow users from sending messages larger than what the model can handle (for web users only, though); see the validation sketch after this list.
  • Do proper limiting and/or estimation of what the max token limit should be. Currently users can set this to any number, regardless of the model's context limit.
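A rough sketch of what the web-user check could look like, assuming the limit comes from the model metadata above and that tiktoken is used for counting (both are assumptions, not the actual OCS code):

```python
# Illustrative only: reject a web user's message if it exceeds the selected
# model's token limit. Token counting via tiktoken is an assumption.
import tiktoken


def validate_message_length(message: str, model_name: str, max_token_limit: int) -> None:
    try:
        encoding = tiktoken.encoding_for_model(model_name)
    except KeyError:
        # Fine-tuned or otherwise unknown models fall back to a common encoding.
        encoding = tiktoken.get_encoding("cl100k_base")
    token_count = len(encoding.encode(message))
    if token_count > max_token_limit:
        raise ValueError(
            f"Message is {token_count} tokens; {model_name} supports at most {max_token_limit}."
        )
```

The same metadata would also let the UI cap the configurable max token setting at the model's real context limit instead of accepting any number.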

@SmittieC SmittieC changed the title OCS to have knowledge of model metadata Track model metadata Jun 28, 2024
@SmittieC SmittieC added the enhancement New feature or request label Jun 28, 2024
@SmittieC
Collaborator Author

Some thoughts on this can be found here cc @snopoke @stephherbers @bderenzi

@SmittieC SmittieC self-assigned this Jul 16, 2024
@snopoke snopoke assigned proteusvacuum and unassigned SmittieC Oct 17, 2024
@SmittieC
Collaborator Author

Langfuse has a nice way of doing it, under the Tracing tab -> Models. This might serve as inspiration.

@marcklingen

> Langfuse has a nice way of doing it, under the Tracing tab -> Models. This might serve as inspiration.

FYI, this is currently changing in Langfuse since more detailed model price details are needed (cached/video/audio tokens). It will probably be released next week.

@proteusvacuum proteusvacuum moved this to 🏗 In progress in OpenChatStudio Oct 29, 2024
@proteusvacuum proteusvacuum linked a pull request Oct 31, 2024 that will close this issue
@github-project-automation github-project-automation bot moved this from 🏗 In progress to ✅ Done in OpenChatStudio Nov 11, 2024