Commit
Add to sidebar and make consistent with the rest of the providers
Showing 2 changed files with 12 additions and 42 deletions.
@@ -1,55 +1,20 @@
 # LiteLLM Proxy
 
-OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers. This is particularly useful when you want to:
-
-- Use a single interface to access multiple LLM providers
-- Add authentication, rate limiting, and other features to your LLM API
-- Route requests through a proxy for security or networking requirements
+OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers.
 
 ## Configuration
 
 To use LiteLLM proxy with OpenHands, you need to:
 
 1. Set up a LiteLLM proxy server (see [LiteLLM documentation](https://docs.litellm.ai/docs/proxy/quick_start))
-2. Configure OpenHands to use the proxy
-
-Here's an example configuration:
-
-```toml
-[llm]
-# Important: Use `litellm_proxy/` instead of `openai/`
-model = "litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0" # The model name as configured in your LiteLLM proxy
-base_url = "https://your-litellm-proxy.com" # Your LiteLLM proxy URL
-api_key = "your-api-key" # API key for authentication with the proxy
-```
-
-:::caution
-When using LiteLLM proxy, make sure to use the `litellm_proxy` provider instead of `openai`. Using `openai` as the provider may cause compatibility issues with certain LLM providers like Bedrock.
-:::
-
-## Example Usage
-
-Here's how to use LiteLLM proxy in your OpenHands configuration:
-
-```toml
-[llm]
-model = "litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0"
-base_url = "https://proxy.example.com"
-api_key = "your-api-key"
-temperature = 0.0
-top_p = 1.0
-
-```
+2. When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
+   * Enable `Advanced Options`
+   * `Custom Model` to the prefix `litellm_proxy/` + the model you will be using (e.g. `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`)
+   * `Base URL` to your LiteLLM proxy URL (e.g. `https://your-litellm-proxy.com`)
+   * `API Key` to your LiteLLM proxy API key
 
 ## Supported Models
 
-The supported models depend on your LiteLLM proxy configuration. OpenHands supports any model that your LiteLLM proxy is configured to handle, including:
-
-- OpenAI models
-- Anthropic Claude models
-- AWS Bedrock models
-- Azure OpenAI models
-- And more
+The supported models depend on your LiteLLM proxy configuration. OpenHands supports any model that your LiteLLM proxy is configured to handle.
 
 Refer to your LiteLLM proxy configuration for the list of available models and their names.
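
Because the proxy speaks the OpenAI API, the `Base URL`, `API Key`, and model name from step 2 of the new instructions can be sanity-checked outside OpenHands with the `openai` Python client before entering them in the Settings UI. A minimal sketch, reusing the placeholder URL, key, and model name from the examples above:

```python
# Exercise the same Base URL / API Key / model name that step 2 enters in the
# OpenHands UI. All three values below are placeholders from the docs above --
# substitute your own proxy's details.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-litellm-proxy.com",  # the `Base URL` setting
    api_key="your-api-key",                     # the `API Key` setting
)

# Note: no `litellm_proxy/` prefix here -- that prefix only tells OpenHands
# which provider route to use; the proxy itself sees the bare model name.
response = client.chat.completions.create(
    model="anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=[{"role": "user", "content": "Reply with one word."}],
)
print(response.choices[0].message.content)
```

If this call succeeds, the same three values should work in the OpenHands Settings.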
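For the model names themselves, the proxy's OpenAI-compatible model-listing endpoint reports what it is configured to serve. A short sketch, with the same placeholder credentials as above:

```python
# List the model names the proxy serves; each one becomes a valid
# `litellm_proxy/<name>` value for OpenHands' Custom Model setting.
from openai import OpenAI

client = OpenAI(base_url="https://your-litellm-proxy.com", api_key="your-api-key")

for model in client.models.list():
    print(model.id)
```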