[Bug]: drop_params=True not working for dimensions parameter with OpenAI-compatible endpoints #6516
Comments
that's not a bug though - since
you can work around this today with this - https://docs.litellm.ai/docs/completion/drop_params#specify-params-to-drop - but i guess the real ask is to have native support for LM Studio. So, closing in favor of #3755
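For reference, a minimal sketch of that workaround, assuming `additional_drop_params` applies to embedding calls the same way the linked docs show for completion; the model name and api_base are placeholders for a local LM Studio server:

```python
import litellm

# Sketch of the workaround from the drop_params docs: explicitly list
# `dimensions` in additional_drop_params so litellm strips it before
# sending the request. Model name and api_base are placeholders.
response = litellm.embedding(
    model="openai/nomic-embed-text-v1.5",    # hypothetical LM Studio model
    input=["hello world"],
    api_base="http://localhost:1234/v1",     # LM Studio's default local endpoint
    api_key="lm-studio",                     # LM Studio ignores the key's value
    dimensions=512,
    additional_drop_params=["dimensions"],   # drop this param before the API call
)
```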
I see. This gets a little messy in our implementation then, since we try to just have a generic litellm class that routes the requests. It feels a bit tacky to conditionally check if it is truly an OpenAI request (which we do use, and in many cases specify the dimensions) vs. an OpenAI-like request, as in the sketch below. Is native support on the near-term roadmap?
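A hypothetical illustration of the special-casing described above (not R2R's actual code): a generic wrapper has to guess whether the target is real OpenAI, where `dimensions` is supported, or merely OpenAI-compatible, where it must be dropped.

```python
import litellm

# Hypothetical sketch: route on "is this genuinely OpenAI?" just to
# decide whether the `dimensions` kwarg is safe to forward.
def embed(texts, model, api_base=None, dimensions=None):
    kwargs = {"model": model, "input": texts}
    if api_base is None and dimensions is not None:
        # Genuine OpenAI endpoint: safe to request reduced dimensions.
        kwargs["dimensions"] = dimensions
    elif api_base is not None:
        # OpenAI-compatible server (e.g. LM Studio): silently omit dimensions.
        kwargs["api_base"] = api_base
    return litellm.embedding(**kwargs)
```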
that's reasonable - yes we can add it this weekend. For future integrations (other openai-like APIs that might need some param add/drop logic), what would the ideal flow be?
Closing in favor of #3755
What happened?
When using OpenAI-compatible endpoints, setting litellm.drop_params = True does not prevent the `dimensions` parameter from being passed to the API, resulting in an UnsupportedParamsError. Would love to get this fixed so that we could better support LM Studio in R2R.

Example output when running without/with dimensions set:
And the code to replicate:
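A minimal sketch of a reproduction, assuming LM Studio's default local server at http://localhost:1234/v1; the model name is a placeholder:

```python
import litellm

litellm.drop_params = True  # expected to drop unsupported params globally

common = dict(
    model="openai/nomic-embed-text-v1.5",   # hypothetical LM Studio model
    input=["hello world"],
    api_base="http://localhost:1234/v1",    # LM Studio's default local endpoint
    api_key="lm-studio",                    # LM Studio ignores the key's value
)

# Works: no dimensions argument is sent.
litellm.embedding(**common)

# Fails with UnsupportedParamsError: `dimensions` is still forwarded
# even though drop_params is set.
litellm.embedding(**common, dimensions=512)
```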
Relevant log output
Twitter / LinkedIn details
No response