[Bug]: drop_params=True not working for dimensions parameter with OpenAI-compatible endpoints #6516

Closed as not planned
@NolanTrem

Description

What happened?

When using OpenAI-compatible endpoints, setting `litellm.drop_params = True` does not prevent the `dimensions` parameter from being passed to the API, resulting in an `UnsupportedParamsError`. Would love to get this fixed so that we can better support LM Studio in R2R.

Example output when running without and with `dimensions` set (screenshot omitted).

And the code to replicate:

import litellm
import logging
from typing import Dict, Any
import json

logging.basicConfig(
    level=logging.INFO,
    format='%(message)s'  # Simplified logging format
)
logger = logging.getLogger(__name__)

def test_embedding(test_name: str, params: Dict[str, Any]) -> None:
    print(f"\n=== Test: {test_name} ===")
    print(f"Parameters: {json.dumps(params, indent=2)}")
    print(f"litellm.drop_params is set to: {litellm.drop_params}")
    
    try:
        response = litellm.embedding(**params)
        print("\nResponse received successfully!")
        
        if response and hasattr(response, 'data') and response.data:
            embeddings = [d['embedding'] for d in response.data if 'embedding' in d]
            if embeddings:
                print(f"Success! First embedding length: {len(embeddings[0])}")
            else:
                print("No embeddings found in response data")
            
    except Exception as e:
        print(f"\nError occurred:")
        print(f"Error message: {str(e)}")
        print(f"Error type: {type(e)}")

def main():
    # Base parameters
    base_params = {
        "model": "openai/text-embedding-nomic-embed-text-v1.5-embedding",
        "api_base": "http://localhost:1234/v1",
        "api_key": "fake-api-key-123",
        "input": ["This is a test sentence for embeddings."]
    }
    
    # Set drop_params to True as recommended in the error message
    litellm.drop_params = True
    print(f"Setting litellm.drop_params = {litellm.drop_params}")
    
    # Test 1: Without dimensions parameter
    test_embedding("Without dimensions parameter", base_params.copy())
    
    # Test 2: With dimensions parameter
    params_with_dimensions = base_params.copy()
    params_with_dimensions["dimensions"] = 768
    test_embedding("With dimensions parameter", params_with_dimensions)

if __name__ == "__main__":
    main()

Relevant log output

Error message: litellm.UnsupportedParamsError: Setting dimensions is not supported for OpenAI `text-embedding-3` and later models. To drop it from the call, set `litellm.drop_params = True`.
Error type: <class 'litellm.exceptions.UnsupportedParamsError'>
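Until `drop_params` is honored for this path, one manual workaround is to strip the rejected parameter from the kwargs dict before calling `litellm.embedding`. This is only a sketch of that idea, not litellm's own behavior; `UNSUPPORTED_KEYS` is an assumption covering just the `dimensions` case from this report:

```python
from typing import Any, Dict

# Assumed set of keys this endpoint rejects; extend as needed.
UNSUPPORTED_KEYS = {"dimensions"}

def strip_unsupported(params: Dict[str, Any]) -> Dict[str, Any]:
    """Return a copy of params without the keys the endpoint is known to reject."""
    return {k: v for k, v in params.items() if k not in UNSUPPORTED_KEYS}

# Hypothetical usage with the repro script's params:
# response = litellm.embedding(**strip_unsupported(params_with_dimensions))
```

The copy leaves the caller's dict untouched, so the same params can still be reused for providers that do support `dimensions`.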


Labels: bug