Docs for v0.1.17 #75

Merged · 7 commits · Feb 10, 2025
19 changes: 17 additions & 2 deletions docs/about/changelog.md
@@ -13,12 +13,27 @@ Major features and changes are noted here. To review all updates, see the

Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)

- **New integration: Open Interpreter** - xx Feb\
2025 CodeGate v0.1.16 introduces support for
- **Model muxing** - 7 Feb, 2025\
With CodeGate v0.1.17 you can use the new `/v1/mux` endpoint to configure
model selection based on your workspace! Learn more in the
[model muxing guide](../features/muxing.md).

- **OpenRouter endpoint** - 7 Feb, 2025\
CodeGate v0.1.17 adds a dedicated `/openrouter` provider endpoint for
OpenRouter users. This endpoint currently works with Continue, Cline, and Kodu
(Claude Coder).

- **New integration: Open Interpreter** - 4 Feb, 2025\
CodeGate v0.1.16 added support for
[Open Interpreter](https://github.com/openinterpreter/open-interpreter) with
OpenAI-compatible APIs. Review the
[integration guide](../integrations/open-interpreter.mdx) to get started.

- **New integration: Claude Coder** - 28 Jan, 2025\
CodeGate v0.1.14 also introduced support for Kodu's
[Claude Coder](https://www.kodu.ai/extension) extension. See the
[integration guide](../integrations/kodu.mdx) to learn more.

- **New integration: Cline** - 28 Jan, 2025\
CodeGate version 0.1.14 adds support for [Cline](https://cline.bot/) with
Anthropic, OpenAI, Ollama, and LM Studio. See the
129 changes: 129 additions & 0 deletions docs/features/muxing.md
@@ -0,0 +1,129 @@
---
title: Model muxing
description: Configure a per-workspace LLM
sidebar_position: 35
---

## Overview

_Model muxing_ (or multiplexing) allows you to configure your AI assistant once
and use [CodeGate workspaces](./workspaces.mdx) to switch between LLM providers
and models without reconfiguring your development environment. This feature is
especially useful when you're working on multiple projects or tasks that require
different AI models.

For each CodeGate workspace, you can select the AI provider and model
combination you want to use. Then, configure your AI coding tool to use the
CodeGate muxing endpoint `http://localhost:8989/v1/mux` as an OpenAI-compatible
API provider.

To change the model currently in use, simply switch your active CodeGate
workspace.

```mermaid
flowchart LR
Client(AI Assistant/Agent)
CodeGate{CodeGate}
WS1[Workspace-A]
WS2[Workspace-B]
WS3[Workspace-C]
LLM1(OpenAI/<br>o3-mini)
LLM2(Ollama/<br>deepseek-r1)
LLM3(OpenRouter/<br>claude-35-sonnet)

Client ---|/v1/mux| CodeGate
CodeGate --> WS1
CodeGate --> WS2
CodeGate --> WS3
WS1 --> |api| LLM1
WS2 --> |api| LLM2
WS3 --> |api| LLM3
```

## Use cases

- You have a project that requires a specific model for a particular task, but
you also need to switch between different models during the course of your
work.
- You want to experiment with different LLM providers and models without having
to reconfigure your AI assistant/agent every time you switch.
- Your AI coding assistant doesn't support a particular provider or model that
you want to use. CodeGate's muxing provides an OpenAI-compatible abstraction
layer.
- You're working on a sensitive project and want to use a local model, but still
have the flexibility to switch to hosted models for other work.
- You want to control your LLM provider spend by using lower-cost models for
some tasks that don't require the power of more advanced (and expensive)
reasoning models.

## Configure muxing

To use muxing with your AI coding assistant, add one or more AI providers to
CodeGate, then select the model to use for each workspace.

CodeGate supports the following LLM providers for muxing:

- Anthropic
- llama.cpp
- LM Studio
- Ollama
- OpenAI (and compatible APIs)
- OpenRouter
- vLLM

### Add a provider

1. In the [CodeGate dashboard](http://localhost:9090), open the **Providers**
page from the **Settings** menu.
1. Click **Add Provider**.
1. Enter a display name for the provider, then select the type from the
drop-down list. The default endpoint and authentication type are filled in
automatically.
1. If you are using a non-default endpoint, update the **Endpoint** value.
1. Optionally, add a **Description** for the provider.
1. If the provider requires authentication, select the **API Key**
authentication option and enter your key.

When you save the settings, CodeGate connects to the provider to retrieve the
available models.

:::note

For locally hosted models, you must use `http://host.docker.internal` instead
of `http://localhost`.

:::
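As an illustration, endpoint values for common locally hosted servers might look like the following; the ports shown are each tool's defaults, not CodeGate requirements, so adjust them to your setup:

```bash
# Hypothetical Endpoint values for locally hosted servers, as entered in the
# provider settings (ports are the tools' defaults; adjust to your setup):
#   Ollama:     http://host.docker.internal:11434
#   LM Studio:  http://host.docker.internal:1234
#   vLLM:       http://host.docker.internal:8000
```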

### Select the model for a workspace

Open the settings of one of your [workspaces](./workspaces.mdx) from the
Workspace selection menu or the
[Manage Workspaces](http://localhost:9090/workspaces) screen.

In the **Preferred Model** section, select the model to use with the workspace.

### Manage existing providers

To edit a provider's settings, click the **Manage** button next to the provider
in the list. For providers that require authentication, you can leave the API
key field blank to preserve the current value.

To delete a provider, click the trash icon next to it. If this provider was in
use by any workspaces, you will need to update their settings to choose a
different provider/model.

### Refresh available models

To refresh the list of models available from a provider, click the **Manage**
button next to the provider in the Providers list, then save it without making
any changes.

## Configure your client

Configure the OpenAI-compatible API base URL of your AI coding assistant/agent
to `http://localhost:8989/v1/mux`. If your client requires a model name and/or
API key, you can enter any values since CodeGate manages the model selection and
authentication.
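As a sketch, here is what a raw request through the muxing endpoint might look like with `curl`, assuming a running CodeGate instance and a client that appends the standard OpenAI `/chat/completions` path to the base URL; the model name and bearer token are placeholders, since CodeGate supplies the real values:

```bash
# Hypothetical request through the muxing endpoint (requires CodeGate running
# locally). CodeGate routes it to the active workspace's provider/model, so the
# model name and bearer token below are placeholders.
curl http://localhost:8989/v1/mux/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer placeholder-key" \
  -d '{
    "model": "placeholder-model",
    "messages": [{ "role": "user", "content": "Hello from my workspace model" }]
  }'
```

Switching the active workspace changes which provider and model answer this same request, with no change on the client side.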

For specific instructions, see the
[integration guide](../integrations/index.mdx) for your client.
13 changes: 9 additions & 4 deletions docs/features/workspaces.mdx
Expand Up @@ -25,9 +25,13 @@ Workspaces offer several key features:

- **Custom instructions**: Customize your interactions with LLMs by augmenting
your AI assistant's system prompt, enabling tailored responses and behaviors
for different types of tasks. CodeGate includes a library of community prompts
that can be easily customized for specific tasks. You can also create your
own.
for different types of tasks. Choose from CodeGate's library of community
prompts or create your own.

- [**Model muxing**](./muxing.md): Configure the LLM provider/model for each
workspace, allowing you to configure your AI assistant/agent once and switch
between different models on the fly. This is useful when working on multiple
projects or tasks that require different AI models.

- **Prompt and alert history**: Your LLM interactions (prompt history) and
CodeGate security detections (alert history) are recorded in the active
@@ -112,7 +116,8 @@ In the workspace list, open the menu (**...**) next to a workspace to
**Activate**, **Edit**, or **Archive** the workspace.

**Edit** opens the workspace settings page. From here you can rename the
workspace, set the custom prompt instructions, or archive the workspace.
workspace, select the LLM provider and model (see [Model muxing](./muxing.md)),
set the custom prompt instructions, or archive the workspace.

**Archived** workspaces can be restored or permanently deleted from the
workspace list or workspace settings screen.
26 changes: 9 additions & 17 deletions docs/how-to/configure.md
@@ -1,14 +1,15 @@
---
title: Configure CodeGate
title: Advanced configuration
description: Customizing CodeGate's application settings
sidebar_position: 20
sidebar_position: 30
---

## Customize CodeGate's behavior

The CodeGate container runs with default settings to support Ollama, Anthropic,
and OpenAI APIs with typical settings. To customize the behavior, you can add
extra configuration parameters to the container as environment variables:
The CodeGate container runs with defaults that work with supported LLM providers
using typical settings. To customize CodeGate's application settings like
provider endpoints and logging level, you can add extra configuration parameters
to the container as environment variables:

```bash {2}
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
@@ -31,22 +32,13 @@ CodeGate supports the following parameters:
| `CODEGATE_OPENAI_URL` | `https://api.openai.com/v1` | Specifies the OpenAI engine API endpoint URL. |
| `CODEGATE_VLLM_URL` | `http://localhost:8000` | Specifies the URL of the vLLM server to use. |

## Example: Use CodeGate with OpenRouter
## Example: Use CodeGate with a remote Ollama server

[OpenRouter](https://openrouter.ai/) is an interface to many large language
models. CodeGate's vLLM provider works with OpenRouter's API when used with the
Continue IDE plugin.

To use OpenRouter, set the vLLM URL when you launch CodeGate:
Set the Ollama server's URL when you launch CodeGate:

```bash {2}
docker run --name codegate -d -p 8989:8989 -p 9090:9090 \
-e CODEGATE_VLLM_URL=https://openrouter.ai/api \
-e CODEGATE_OLLAMA_URL=https://my.ollama-server.example \
--mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
--restart unless-stopped ghcr.io/stacklok/codegate
```

Then,
[configure the Continue IDE plugin](../integrations/continue.mdx?provider=vllm)
to use CodeGate's vLLM endpoint (`http://localhost:8989/vllm`) along with the
model you'd like to use and your OpenRouter API key.
2 changes: 1 addition & 1 deletion docs/how-to/dashboard.md
@@ -1,7 +1,7 @@
---
title: Access the dashboard
description: View alerts and usage history
sidebar_position: 30
sidebar_position: 20
---

## Enable dashboard access
23 changes: 20 additions & 3 deletions docs/index.md
@@ -31,6 +31,20 @@ sequenceDiagram
deactivate CodeGate
```

## Key features

CodeGate includes several key features for privacy, security, and coding
efficiency:

- [Secrets encryption](./features/secrets-encryption.md) to protect your
sensitive credentials
- [Dependency risk awareness](./features/dependency-risk.md) to update the LLM's
knowledge of malicious or deprecated open source packages
- [Model muxing](./features/muxing.md) to quickly select the best LLM
provider/model for your current task
- [Workspaces](./features/workspaces.mdx) to organize and customize your LLM
interactions

## Supported environments

CodeGate supports several development environments and AI providers.
@@ -41,20 +55,23 @@ AI coding assistants / IDEs:

- **[Cline](./integrations/cline.mdx)** in Visual Studio Code

CodeGate supports Ollama, Anthropic, OpenAI-compatible APIs, and LM Studio
with Cline
CodeGate supports Ollama, Anthropic, OpenAI and compatible APIs, OpenRouter,
and LM Studio with Cline

- **[Continue](./integrations/continue.mdx)** with Visual Studio Code and
JetBrains IDEs

CodeGate supports the following AI model providers with Continue:

- Local / self-managed: Ollama, llama.cpp, vLLM
- Hosted: Anthropic, OpenAI and OpenAI-compatible APIs like OpenRouter
- Hosted: Anthropic, OpenAI and compatible APIs, and OpenRouter

- **[GitHub Copilot](./integrations/copilot.mdx)** with Visual Studio Code
(JetBrains coming soon!)

- **[Kodu / Claude Coder](./integrations/kodu.mdx)** in Visual Studio Code with
OpenAI-compatible APIs

- **[Open Interpreter](./integrations/open-interpreter.mdx)** with
OpenAI-compatible APIs

3 changes: 3 additions & 0 deletions docs/integrations/aider.mdx
@@ -17,6 +17,9 @@ CodeGate works with the following AI model providers through aider:
- Hosted:
- [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs

You can also configure [CodeGate muxing](../features/muxing.md) to select your
provider and model using [workspaces](../features/workspaces.mdx).

:::note

This guide assumes you have already installed aider using their
41 changes: 37 additions & 4 deletions docs/integrations/cline.mdx
@@ -18,7 +18,11 @@ CodeGate works with the following AI model providers through Cline:
- [LM Studio](https://lmstudio.ai/)
- Hosted:
- [Anthropic](https://www.anthropic.com/api)
- [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
- [OpenAI](https://openai.com/api/) and compatible APIs
- [OpenRouter](https://openrouter.ai/)

You can also configure [CodeGate muxing](../features/muxing.md) to select your
provider and model using [workspaces](../features/workspaces.mdx).

## Install the Cline extension

@@ -42,10 +46,36 @@ in the VS Code documentation.

import ClineProviders from '../partials/_cline-providers.mdx';

:::note

Cline has two modes: Plan and Act. Each mode can be uniquely configured with a
different provider and model, so you need to configure both.

:::

To configure Cline to send requests through CodeGate:

1. Open the Cline extension sidebar from the VS Code Activity Bar and open its
settings using the gear icon.
1. Open the Cline extension sidebar from the VS Code Activity Bar. Note your
current mode, Plan or Act.

<ThemedImage
alt='Cline mode - plan'
sources={{
light: useBaseUrl('/img/integrations/cline-mode-plan-light.webp'),
dark: useBaseUrl('/img/integrations/cline-mode-plan-dark.webp'),
}}
width={'400px'}
/>
<ThemedImage
alt='Cline mode - act'
sources={{
light: useBaseUrl('/img/integrations/cline-mode-act-light.webp'),
dark: useBaseUrl('/img/integrations/cline-mode-act-dark.webp'),
}}
width={'400px'}
/>

1. Open the Cline settings using the gear icon.

<ThemedImage
alt='Cline extension settings'
@@ -60,7 +90,10 @@ To configure Cline to send requests through CodeGate:

<ClineProviders />

1. Click **Done** to save the settings.
1. Click **Done** to save the settings for your current mode.

1. Switch your Cline mode from Act to Plan or vice versa, open the settings, and
   repeat the configuration for your desired provider and model.

## Verify configuration
