
fix: litellm conflict with openai on lib override #480

Closed
wants to merge 2 commits

Conversation

teocns
Contributor

@teocns teocns commented Oct 31, 2024

Fixes #471

  • Removed original_oai_create and original_oai_create_async variables since we no longer override OpenAI's methods
  • Modified override() and undo_override() to only handle LiteLLM's methods
  • Updated _override_completion() and _override_async_completion() to only store and patch LiteLLM's methods

This way, when both providers are used:

  • OpenAIProvider will handle overriding OpenAI's completion methods
  • LiteLLMProvider will only handle overriding LiteLLM's completion methods
  • No more conflicts between the two providers ✅ (see the sketch below)
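
A minimal sketch of what the provider ends up looking like after this change. The method names (`override`, `undo_override`, `_override_completion`, `_override_async_completion`) come from the description above; the class scaffolding and the event-recording step are assumptions, not the repo's exact code:

```python
import litellm

class LiteLLMProvider:
    """Sketch: patches only litellm's functions, never OpenAI's client."""

    original_create = None        # stores litellm.completion before patching
    original_create_async = None  # stores litellm.acompletion before patching

    def override(self):
        self._override_completion()
        self._override_async_completion()

    def _override_completion(self):
        self.original_create = litellm.completion

        def patched_function(*args, **kwargs):
            result = self.original_create(*args, **kwargs)
            # ... record the LLM event here (elided) ...
            return result

        litellm.completion = patched_function

    def _override_async_completion(self):
        self.original_create_async = litellm.acompletion

        async def patched_function(*args, **kwargs):
            result = await self.original_create_async(*args, **kwargs)
            # ... record the LLM event here (elided) ...
            return result

        litellm.acompletion = patched_function

    def undo_override(self):
        if self.original_create is not None:
            litellm.completion = self.original_create
        if self.original_create_async is not None:
            litellm.acompletion = self.original_create_async
```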

A bit more explanation

LiteLLM's completion method is completely separate from OpenAI's Completions.create. Even though LiteLLM uses OpenAI's format internally, it has its own implementation: when we call litellm.completion(), it doesn't actually call OpenAI's Completions.create.
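
A quick way to see that these are distinct callables (an illustrative check; the placeholder API key just avoids needing credentials, and no request is made):

```python
import litellm
import openai

client = openai.OpenAI(api_key="sk-placeholder")  # placeholder key; no request is made

# Two unrelated callables: patching one cannot affect the other.
print(litellm.completion is client.chat.completions.create)  # False
print(litellm.completion.__module__)              # lives in litellm
print(client.chat.completions.create.__module__)  # lives in openai
```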

Tests

Breakdown
  • test_provider_override_independence():
    1. Verifies that each provider only overrides its own methods
    2. Checks that the original methods are properly stored and restored
    3. Ensures no cross-contamination between providers
  • test_provider_override_order_independence():
    1. Verifies that the order of provider initialization doesn't matter
    2. Tests both possible orders (OpenAI first vs LiteLLM first)
    3. Ensures each provider's overrides remain intact regardless of order
What's being verified
  • Each provider only overrides its intended methods
  • Original methods are properly stored and can be restored
  • The order of provider initialization doesn't affect the overrides
  • No provider interferes with another provider's overrides (the first test is sketched below)
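
A rough sketch of the first test's logic; the provider import path and constructor arguments are assumptions and may differ from the repo's actual layout:

```python
import litellm
import openai.resources.chat

# Assumed import path; the real location inside agentops may differ.
from agentops.llms import LiteLLMProvider

def test_provider_override_independence():
    original_litellm = litellm.completion
    original_openai = openai.resources.chat.Completions.create

    provider = LiteLLMProvider(None)  # constructor arguments assumed
    provider.override()

    # LiteLLM's function is patched; OpenAI's method is untouched.
    assert litellm.completion is not original_litellm
    assert openai.resources.chat.Completions.create is original_openai

    provider.undo_override()
    assert litellm.completion is original_litellm
```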
To run the tests:

```
python tests/test_providers.py
```

@areibman
Contributor

areibman commented Nov 5, 2024

I ran this in Python and it didn't work for OpenAI:

```python
import litellm
import openai
import agentops

agentops.init(default_tags=["litellm"])

message = [{"role": "user", "content": "Write a 12 word poem about secret agents."}]

# Routed through LiteLLM; this call is tracked.
litellm.completion(model="claude-3-sonnet-20240229", messages=message, temperature=0)

client = openai.OpenAI()

# Direct OpenAI call; this one is not recorded.
client.chat.completions.create(model="gpt-4o", messages=message, temperature=0)
```

@areibman
Contributor

areibman commented Nov 5, 2024

@the-praxs Can you give it a look?

@the-praxs
Member

> @the-praxs Can you give it a look?

Testing it: same issue as yesterday. Only the first create call uses the patched function.

I am trying to troubleshoot the root cause.
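
For anyone reproducing this, one quick diagnostic (a suggestion, not part of the PR) is to check which callables are live after agentops.init(); a patched wrapper usually reports a different __qualname__/__module__ than the library original:

```python
import agentops
import litellm
import openai.resources.chat

agentops.init()

# The patched wrapper usually has a different __qualname__ / __module__
# than the library's original function.
print(litellm.completion.__qualname__, litellm.completion.__module__)
print(openai.resources.chat.Completions.create.__qualname__)
```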

@teocns
Contributor Author

teocns commented Nov 18, 2024

Closing in favor of #476.

@teocns teocns closed this Nov 18, 2024
@areibman
Contributor

If we want to avoid client conflicts, perhaps we can instead use their official observability integration: https://docs.litellm.ai/docs/observability/arize_integration
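
For instance, LiteLLM's documented custom-callback hook can observe completions without any monkey-patching. This is a sketch of the general mechanism; the callback signature follows LiteLLM's custom-callback docs, and the logger itself is illustrative, not an agentops integration:

```python
import litellm

# Signature per LiteLLM's custom-callback documentation.
def log_success(kwargs, completion_response, start_time, end_time):
    print(f"model={kwargs.get('model')}, latency={end_time - start_time}")

litellm.success_callback = [log_success]

litellm.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hello"}],
)
```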

@areibman areibman reopened this Nov 18, 2024
@teocns
Contributor Author

teocns commented Nov 21, 2024

Since the other PR was closed by the author, I will consider this as "Resolves #471".

@teocns teocns marked this pull request as draft December 9, 2024 16:51
@teocns teocns closed this Dec 9, 2024
Linked issue: [Bug]: LiteLLM conflicts with OpenAI (#471)