Partial fixes for FIM with OpenRouter #990


Merged: 4 commits merged into main on Feb 10, 2025
Conversation

@jhrozek (Contributor) commented Feb 7, 2025

These fixes are partial because, for some reason, Continue still doesn't like
the chunks we receive.

  • Add openrouter integration tests - What it says on the tin
  • Workaround for litellm using a strict OpenAI provider for OpenRouter - The problem is that litellm uses the OpenAI provider for talking to OpenRouter, but OpenRouter uses an OpenAI dialect. For example, the OpenAI python API no longer allows you to POST payloads that contain `prompt`, yet those are often used with OpenRouter for FIM. The effect is that FIM payloads from Continue are rejected by litellm. To work around that, we add a FIM normalizer for OpenRouter that moves `prompt` to `messages`, as we often do during normalization, but in this normalizer we don't de-normalize; instead we pass the payload with `messages` on to the completion.
  • Listen on /openrouter/completions for FIM - Continue sends FIM requests for openrouter to /completions, so let's add that route
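The workaround in the second bullet can be sketched roughly as follows. This is a minimal illustration assuming a dict-shaped request payload; the class and method names are assumptions, not codegate's actual normalizer API:

```python
# Illustrative sketch of the workaround; names are hypothetical, not
# codegate's real API.
from typing import Any


class OpenRouterFIMNormalizer:
    """Move a legacy `prompt` field into chat-style `messages` so the
    payload passes litellm's strict OpenAI provider."""

    def normalize(self, payload: dict[str, Any]) -> dict[str, Any]:
        normalized = dict(payload)  # don't mutate the caller's payload
        prompt = normalized.pop("prompt", None)
        if prompt is not None:
            # Represent the FIM prompt as a single user message.
            normalized["messages"] = [{"role": "user", "content": prompt}]
        # Unlike the usual normalizers, there is no matching de-normalize
        # step: the messages-shaped payload goes straight to completion.
        return normalized
```

The key design point is the asymmetry: normalization is one-way here, because the whole purpose is to hand litellm a `messages` payload it will accept.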

Related: #980

@jhrozek jhrozek force-pushed the continue_openrouter_fim branch from 4566e3e to e8610e4 Compare February 7, 2025 18:30
@jhrozek (Contributor, Author) commented Feb 7, 2025

let's merge #983 first though

A new provider needs tests! Let's not be sloppy.
The problem is that litellm uses the OpenAI provider for talking to
OpenRouter, but OpenRouter uses an OpenAI dialect - for example, the
OpenAI python API no longer allows you to POST payloads that contain
`prompt`, but those are often used with OpenRouter for FIM. The effect is
that FIM payloads from Continue are rejected by litellm.

To work around that, we add a FIM normalizer for OpenRouter that moves
prompt to messages, like we often do during normalization, but in this
normalizer we don't de-normalize but instead pass on the payload with
`messages` to the completion.
Continue sends FIM requests for openrouter to /completions, let's add that route
@jhrozek jhrozek force-pushed the continue_openrouter_fim branch from e8610e4 to f35276e Compare February 10, 2025 08:36
@rdimitrov rdimitrov merged commit 0753bd6 into main Feb 10, 2025
9 checks passed
@rdimitrov rdimitrov deleted the continue_openrouter_fim branch February 10, 2025 10:43