
Commit

Add to sidebar and make consistent with the rest of providers
mamoodi committed Nov 12, 2024
1 parent 88e42d3 commit b08b067
Showing 2 changed files with 12 additions and 42 deletions.
49 changes: 7 additions & 42 deletions docs/modules/usage/llms/litellm-proxy.md
@@ -1,55 +1,20 @@

# LiteLLM Proxy

OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers. This is particularly useful when you want to:

- Use a single interface to access multiple LLM providers
- Add authentication, rate limiting, and other features to your LLM API
- Route requests through a proxy for security or networking requirements
OpenHands supports using the [LiteLLM proxy](https://docs.litellm.ai/docs/proxy/quick_start) to access various LLM providers.

## Configuration

To use LiteLLM proxy with OpenHands, you need to:

1. Set up a LiteLLM proxy server (see [LiteLLM documentation](https://docs.litellm.ai/docs/proxy/quick_start))
2. Configure OpenHands to use the proxy

Here's an example configuration:

```toml
[llm]
# Important: Use `litellm_proxy/` instead of `openai/`
model = "litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0" # The model name as configured in your LiteLLM proxy
base_url = "https://your-litellm-proxy.com" # Your LiteLLM proxy URL
api_key = "your-api-key" # API key for authentication with the proxy
```

:::caution
When using LiteLLM proxy, make sure to use the `litellm_proxy` provider instead of `openai`. Using `openai` as the provider may cause compatibility issues with certain LLM providers like Bedrock.
:::
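To guard against the misconfiguration described in the caution above, a config loader could validate the provider prefix up front. The helper below is a hypothetical sketch, not part of OpenHands:

```python
def check_proxy_model(model: str) -> str:
    """Reject model names that lack the `litellm_proxy/` provider prefix
    (e.g. ones mistakenly using `openai/`). Hypothetical helper for illustration."""
    if not model.startswith("litellm_proxy/"):
        raise ValueError(f"expected a 'litellm_proxy/' prefix, got: {model!r}")
    return model

# A correctly prefixed model name passes through unchanged.
check_proxy_model("litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0")
```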

## Example Usage

Here's how to use LiteLLM proxy in your OpenHands configuration:

```toml
[llm]
model = "litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0"
base_url = "https://proxy.example.com"
api_key = "your-api-key"
temperature = 0.0
top_p = 1.0

```
2. When running OpenHands, you'll need to set the following in the OpenHands UI under Settings:
* Enable `Advanced Options`
* `Custom Model` to `litellm_proxy/` followed by the model you will be using (e.g. `litellm_proxy/anthropic.claude-3-5-sonnet-20241022-v2:0`)
* `Base URL` to your LiteLLM proxy URL (e.g. `https://your-litellm-proxy.com`)
* `API Key` to your LiteLLM proxy API key
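A LiteLLM proxy exposes an OpenAI-compatible HTTP API, so the settings above map directly onto an ordinary chat-completion request. The sketch below builds (but does not send) such a request with the standard library; the URL, key, and model name are placeholders:

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Assemble an OpenAI-style /chat/completions request for a LiteLLM proxy.
    Illustrative only: uses placeholder credentials and does not send anything."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request(
    "https://your-litellm-proxy.com",
    "your-api-key",
    "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "Hello",
)
# Send with urllib.request.urlopen(req) once the proxy is actually reachable.
```

Note that when calling the proxy directly like this, the model name is whatever your proxy is configured to serve; the `litellm_proxy/` prefix is only used on the OpenHands side to select the provider.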

## Supported Models

The supported models depend on your LiteLLM proxy configuration. OpenHands supports any model that your LiteLLM proxy is configured to handle, including:

- OpenAI models
- Anthropic Claude models
- AWS Bedrock models
- Azure OpenAI models
- And more
The supported models depend on your LiteLLM proxy configuration. OpenHands supports any model that your LiteLLM proxy is configured to handle.

Refer to your LiteLLM proxy configuration for the list of available models and their names.
5 changes: 5 additions & 0 deletions docs/sidebars.ts
@@ -76,6 +76,11 @@ const sidebars: SidebarsConfig = {
label: 'Groq',
id: 'usage/llms/groq',
},
{
type: 'doc',
label: 'LiteLLM Proxy',
id: 'usage/llms/litellm-proxy',
},
{
type: 'doc',
label: 'OpenAI',
