Commit: documentation changes associated with UI changes and more consistency (…
Showing 10 changed files with 59 additions and 95 deletions.
Google Gemini/Vertex LLM doc:

````diff
@@ -1,28 +1,30 @@
 # Google Gemini/Vertex LLM
 
 ## Completion
 
 OpenHands uses LiteLLM for completion calls. The following resources are relevant for using OpenHands with Google's LLMs:
 
 - [Gemini - Google AI Studio](https://docs.litellm.ai/docs/providers/gemini)
 - [VertexAI - Google Cloud Platform](https://docs.litellm.ai/docs/providers/vertex)
 
-### Gemini - Google AI Studio Configs
+## Gemini - Google AI Studio Configs
 
-To use Gemini through Google AI Studio when running the OpenHands Docker image, you'll need to set the following environment variables using `-e`:
+When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
+* `LLM Provider` to `Gemini`
+* `LLM Model` to the model you will be using.
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. gemini/<model-name>).
+* `API Key`
 
-```
-GEMINI_API_KEY="<your-google-api-key>"
-LLM_MODEL="gemini/gemini-1.5-pro"
-```
+## VertexAI - Google Cloud Platform Configs
 
-### Vertex AI - Google Cloud Platform Configs
-
-To use Vertex AI through Google Cloud Platform when running the OpenHands Docker image, you'll need to set the following environment variables using `-e`:
+To use Vertex AI through Google Cloud Platform when running OpenHands, you'll need to set the following environment
+variables using `-e` in the [docker run command](/modules/usage/getting-started#installation):
 
 ```
 GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>"
 VERTEXAI_PROJECT="<your-gcp-project-id>"
 VERTEXAI_LOCATION="<your-gcp-location>"
 LLM_MODEL="vertex_ai/<desired-llm-model>"
 ```
+
+Then set the following in the OpenHands UI through the Settings:
+* `LLM Provider` to `VertexAI`
+* `LLM Model` to the model you will be using.
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. vertex_ai/<model-name>).
````
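For context, a minimal sketch of how the Vertex AI variables from this diff might be passed with `-e` on the docker run command. This is an illustration, not part of the commit: the image name, tag, and port mapping below are placeholders, and the full set of recommended flags lives in the getting-started guide the doc links to.

```sh
# Illustrative only: pass the Vertex AI settings from the docs as -e environment
# variables when starting OpenHands. Replace <openhands-image> with the image and
# tag from the getting-started guide; any other flags it recommends still apply.
docker run -it --rm \
    -p 3000:3000 \
    -e GOOGLE_APPLICATION_CREDENTIALS="<json-dump-of-gcp-service-account-json>" \
    -e VERTEXAI_PROJECT="<your-gcp-project-id>" \
    -e VERTEXAI_LOCATION="<your-gcp-location>" \
    -e LLM_MODEL="vertex_ai/<desired-llm-model>" \
    <openhands-image>
```

Once the container is up, the remaining choices (`LLM Provider`, `LLM Model`, `API Key`) are made in the OpenHands UI through the Settings, as the updated doc describes.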
OpenAI doc:

````diff
@@ -1,23 +1,16 @@
 # OpenAI
 
-OpenHands uses [LiteLLM](https://www.litellm.ai/) to make calls to OpenAI's chat models. You can find their full documentation on OpenAI chat calls [here](https://docs.litellm.ai/docs/providers/openai).
+OpenHands uses LiteLLM to make calls to OpenAI's chat models. You can find their full documentation on OpenAI chat calls [here](https://docs.litellm.ai/docs/providers/openai).
 
 ## Configuration
 
-When running the OpenHands Docker image, you'll need to choose a model and set your API key in the OpenHands UI through the Settings.
-
-To see a full list of OpenAI models that LiteLLM supports, please visit https://docs.litellm.ai/docs/providers/openai#openai-chat-completion-models.
-
-To find or create your OpenAI Project API Key, please visit https://platform.openai.com/api-keys.
+When running OpenHands, you'll need to set the following in the OpenHands UI through the Settings:
+* `LLM Provider` to `OpenAI`
+* `LLM Model` to the model you will be using.
+[Visit **here** to see a full list of OpenAI models that LiteLLM supports.](https://docs.litellm.ai/docs/providers/openai#openai-chat-completion-models)
+If the model is not in the list, toggle `Advanced Options`, and enter it in `Custom Model` (i.e. openai/<model-name>).
+* `API Key`. To find or create your OpenAI Project API Key, [see **here**](https://platform.openai.com/api-keys).
 
 ## Using OpenAI-Compatible Endpoints
 
 Just as for OpenAI Chat completions, we use LiteLLM for OpenAI-compatible endpoints. You can find their full documentation on this topic [here](https://docs.litellm.ai/docs/providers/openai_compatible).
 
 When running the OpenHands Docker image, you'll need to set the following environment variables using `-e`:
 
 ```sh
 LLM_BASE_URL="<api-base-url>" # e.g. "http://0.0.0.0:3000"
 ```
 
 Then set your model and API key in the OpenHands UI through the Settings.
````
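Similarly, a minimal sketch of pointing OpenHands at an OpenAI-compatible endpoint using the `LLM_BASE_URL` variable kept in this doc. Again illustrative rather than part of the commit: only `LLM_BASE_URL` comes from the doc; the image name and port mapping are placeholders.

```sh
# Illustrative only: set LLM_BASE_URL with -e so OpenHands talks to an
# OpenAI-compatible server (e.g. "http://0.0.0.0:3000" as in the doc), then
# choose the model and API key in the OpenHands UI through the Settings.
docker run -it --rm \
    -p 3000:3000 \
    -e LLM_BASE_URL="<api-base-url>" \
    <openhands-image>
```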