
[Bug]: drop_params=True not working for dimensions parameter with OpenAI-compatible endpoints #6516

Closed
NolanTrem opened this issue Oct 30, 2024 · 5 comments
Labels
bug Something isn't working

Comments

@NolanTrem
Contributor

What happened?

When using OpenAI-compatible endpoints, setting litellm.drop_params = True does not prevent the dimensions parameter from being passed to the API, resulting in an UnsupportedParamsError. Would love to get this fixed so that we can better support LM Studio in R2R.

Example output when running without/with dimensions set:
[Screenshot: the call succeeds without `dimensions` and raises UnsupportedParamsError when `dimensions` is set]

And the code to replicate:

import litellm
import logging
from typing import Dict, Any
import json

logging.basicConfig(
    level=logging.INFO,
    format='%(message)s'  # Simplified logging format
)
logger = logging.getLogger(__name__)

def test_embedding(test_name: str, params: Dict[str, Any]) -> None:
    print(f"\n=== Test: {test_name} ===")
    print(f"Parameters: {json.dumps(params, indent=2)}")
    print(f"litellm.drop_params is set to: {litellm.drop_params}")
    
    try:
        response = litellm.embedding(**params)
        print("\nResponse received successfully!")
        
        if response and hasattr(response, 'data') and response.data:
            embeddings = [d['embedding'] for d in response.data if 'embedding' in d]
            if embeddings:
                print(f"Success! First embedding length: {len(embeddings[0])}")
            else:
                print("No embeddings found in response data")
            
    except Exception as e:
        print("\nError occurred:")
        print(f"Error message: {e}")
        print(f"Error type: {type(e)}")

def main():
    # Base parameters
    base_params = {
        "model": "openai/text-embedding-nomic-embed-text-v1.5-embedding",
        "api_base": "http://localhost:1234/v1",
        "api_key": "fake-api-key-123",
        "input": ["This is a test sentence for embeddings."]
    }
    
    # Set drop_params to True as recommended in the error message
    litellm.drop_params = True
    print(f"Setting litellm.drop_params = {litellm.drop_params}")
    
    # Test 1: Without dimensions parameter
    test_embedding("Without dimensions parameter", base_params.copy())
    
    # Test 2: With dimensions parameter
    params_with_dimensions = base_params.copy()
    params_with_dimensions["dimensions"] = 768
    test_embedding("With dimensions parameter", params_with_dimensions)

if __name__ == "__main__":
    main()

Relevant log output

Error message: litellm.UnsupportedParamsError: Setting dimensions is not supported for OpenAI `text-embedding-3` and later models. To drop it from the call, set `litellm.drop_params = True`.
Error type: <class 'litellm.exceptions.UnsupportedParamsError'>


@NolanTrem added the bug label on Oct 30, 2024
@krrishdholakia
Contributor

That's not a bug though - since `dimensions` is an OpenAI param, I would expect it to be passed along.

@krrishdholakia
Contributor

You can work around this today with this: https://docs.litellm.ai/docs/completion/drop_params#specify-params-to-drop
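Applied to the repro above, that would look roughly like the sketch below. This is a minimal sketch based on the linked docs; it assumes `additional_drop_params` is honored on the embedding path, not just completion:

import litellm

# Per-call param dropping, per the drop_params docs linked above.
# Assumption: `additional_drop_params` also applies to litellm.embedding().
response = litellm.embedding(
    model="openai/text-embedding-nomic-embed-text-v1.5-embedding",
    api_base="http://localhost:1234/v1",
    api_key="fake-api-key-123",
    input=["This is a test sentence for embeddings."],
    dimensions=768,  # would otherwise trigger UnsupportedParamsError
    additional_drop_params=["dimensions"],  # drop just this param for this call
)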

But I guess the real ask is to have native support for LM Studio, so closing in favor of #3755.

@NolanTrem
Contributor Author

NolanTrem commented Oct 31, 2024

I see. This gets a little messy in our implementation then, since we try to keep a single generic litellm class that routes all requests. It feels a bit tacky to conditionally check whether a request is truly an OpenAI request (which we do use, and in many cases specify dimensions for) vs. an OpenAI-like request...
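For illustration, the kind of conditional we'd rather avoid looks something like this (a hypothetical helper, not R2R's actual code; `is_true_openai` would come from our own routing config):

from typing import Any, Dict

def filter_embedding_params(params: Dict[str, Any], is_true_openai: bool) -> Dict[str, Any]:
    # Hypothetical: keep `dimensions` for real OpenAI endpoints, strip it
    # for OpenAI-compatible servers (e.g. LM Studio) that reject it.
    if is_true_openai:
        return params
    return {k: v for k, v in params.items() if k != "dimensions"}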

Is native support on the near-term roadmap?

@krrishdholakia
Contributor

That's reasonable - yes, we can add it this weekend.

For future integrations (other OpenAI-like APIs that might need some param add/drop logic), what would the ideal flow be?

@krrishdholakia closed this as not planned on Nov 1, 2024
@krrishdholakia
Contributor

Closing in favor of #3755
