diff --git a/docs/about/changelog.md b/docs/about/changelog.md
index 67b1293..81a2292 100644
--- a/docs/about/changelog.md
+++ b/docs/about/changelog.md
@@ -13,6 +13,11 @@ Major features and changes are noted here. To review all updates, see the
Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
+- **Muxing filter rules** - 18 Feb, 2025\
+ CodeGate v0.1.23 adds filter rules for model muxing, allowing you to define
+ which model should be used for a given file type. See the
+ [model muxing docs](../features/muxing.mdx) for more.
+
- **PII redaction** - 10 Feb, 2025\
Starting with v0.1.18, CodeGate now redacts personally identifiable
information (PII) found in LLM prompts and context. See the
@@ -21,7 +26,7 @@ Related: [Upgrade CodeGate](../how-to/install.md#upgrade-codegate)
- **Model muxing** - 7 Feb, 2025\
With CodeGate v0.1.17 you can use the new `/v1/mux` endpoint to configure
model selection based on your workspace! Learn more in the
- [model muxing guide](../features/muxing.md).
+ [model muxing guide](../features/muxing.mdx).
- **OpenRouter endpoint** - 7 Feb, 2025\
CodeGate v0.1.17 adds a dedicated `/openrouter` provider endpoint for
diff --git a/docs/features/muxing.md b/docs/features/muxing.mdx
similarity index 62%
rename from docs/features/muxing.md
rename to docs/features/muxing.mdx
index eced281..09a4903 100644
--- a/docs/features/muxing.md
+++ b/docs/features/muxing.mdx
@@ -1,8 +1,11 @@
---
title: Model muxing
-description: Configure a per-workspace LLM
+description: Pick the right LLM for the job
---
+import useBaseUrl from '@docusaurus/useBaseUrl';
+import ThemedImage from '@theme/ThemedImage';
+
## Overview
_Model muxing_ (or multiplexing) allows you to configure your AI assistant once
@@ -11,12 +14,13 @@ and models without reconfiguring your development environment. This feature is
especially useful when you're working on multiple projects or tasks that require
different AI models.
-For each CodeGate workspace, you can select the AI provider and model
-combination you want to use. Then, configure your AI coding tool to use the
+In each of your CodeGate workspaces, you can select the AI provider and model
+combinations to use, and even dynamically switch the active model based on the
+file types found in your prompt. Then, configure your AI coding tool to use the
CodeGate muxing endpoint `http://localhost:8989/v1/mux` as an OpenAI-compatible
API provider.
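+
+For example, a client built on the OpenAI Python SDK could be pointed at the
+muxing endpoint as in this minimal sketch (the API key and model values are
+illustrative placeholders; your active workspace's muxing rules determine
+which provider and model actually serve the request):
+
+```python
+from openai import OpenAI
+
+# Point any OpenAI-compatible client at the CodeGate muxing endpoint.
+client = OpenAI(
+    base_url="http://localhost:8989/v1/mux",
+    api_key="placeholder",  # provider credentials are configured in CodeGate
+)
+
+response = client.chat.completions.create(
+    model="codegate-mux",  # placeholder; the workspace's muxing rules select the model
+    messages=[{"role": "user", "content": "Explain this stack trace."}],
+)
+print(response.choices[0].message.content)
+```
+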
-To change the model currently in use, simply switch your active CodeGate
+To change the model(s) currently in use, simply switch your active CodeGate
workspace.
```mermaid
@@ -44,6 +48,9 @@ flowchart LR
- You have a project that requires a specific model for a particular task, but
you also need to switch between different models during the course of your
work.
+- You're working in a monorepo with several different languages/file types and
+ want to dynamically switch to an optimal model as you move between different
+ parts of the codebase.
- You want to experiment with different LLM providers and models without having
to reconfigure your AI assistant/agent every time you switch.
- Your AI coding assistant doesn't support a particular provider or model that
@@ -58,7 +65,7 @@ flowchart LR
## Configure muxing
To use muxing with your AI coding assistant, you need to add one or more AI
-providers to CodeGate, then select the model you want to use on a workspace.
+providers to CodeGate, then select the model(s) you want to use on a workspace.
CodeGate supports the following LLM providers for muxing:
@@ -93,13 +100,47 @@ For locally-hosted models, you must use `http://host.docker.internal` instead of
:::
-### Select the model for a workspace
+### Configure workspace models
Open the settings of one of your [workspaces](./workspaces.mdx) from the
-Workspace selection menu or the
+workspace selection menu or the
[Manage Workspaces](http://localhost:9090/workspaces) screen.
-In the **Preferred Model** section, select the model to use with the workspace.
+In the **Model Muxing** section, select the default ("catch-all") model to use
+with the workspace.
+
+To assign a different model based on file name, click **Add Filter**. In the
+**Filter by** column, enter a file name or extension string to match. This is a
+simple substring match; wildcards are not supported. For example, to match
+Python files, enter `.py`. Then select the model to use with that file type.
+
+Filter rules are evaluated top-down. CodeGate selects the active model for a
+request using the first matching rule. If the prompt contains multiple files in
+context, the first rule that matches _any_ of the files is used. If no filter
+matches, the catch-all rule applies (see the example and sketch below).
+
+<ThemedImage
+  alt='Example muxing rules'
+  sources={{
+    light: useBaseUrl('/img/features/muxing-rules-light.webp'),
+    dark: useBaseUrl('/img/features/muxing-rules-dark.webp'),
+  }}
+/>
+
+_An example showing several muxing rules for different file types_
+
+Breaking down the above example:
+
+- Markdown files (`.md`) use the `gpt-4o-mini` model from the OpenAI provider.
+- JavaScript and TypeScript files (`.js` and `.ts`, which also match `.jsx`
+  and `.tsx`) use `anthropic/claude-3.5-sonnet` via OpenRouter.
+- All other requests use Ollama.
+- A request containing both a JavaScript and a Markdown file will match the
+  `.md` rule first and use OpenAI.
+
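+The top-down, first-match behavior can be illustrated with a short sketch
+(hypothetical data structures and model names, not CodeGate's implementation):
+
+```python
+# Hypothetical sketch of top-down, first-match rule evaluation.
+rules = [
+    (".md", "openai/gpt-4o-mini"),
+    (".js", "openrouter/anthropic/claude-3.5-sonnet"),
+    (".ts", "openrouter/anthropic/claude-3.5-sonnet"),
+]
+catch_all = "ollama/local-model"  # hypothetical catch-all model
+
+def select_model(files_in_prompt: list[str]) -> str:
+    # The first rule whose substring matches any file in the prompt wins.
+    for pattern, model in rules:
+        if any(pattern in name for name in files_in_prompt):
+            return model
+    return catch_all
+
+print(select_model(["app.jsx"]))             # ".js" matches -> Claude via OpenRouter
+print(select_model(["notes.md", "app.js"]))  # ".md" rule is first -> gpt-4o-mini
+print(select_model(["main.go"]))             # no match -> catch-all model
+```
+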
+You can verify which provider was used for a given request by checking the
+**conversation summary** in the CodeGate dashboard.
### Manage existing providers
diff --git a/docs/features/workspaces.mdx b/docs/features/workspaces.mdx
index 6ea63b2..7e256b7 100644
--- a/docs/features/workspaces.mdx
+++ b/docs/features/workspaces.mdx
@@ -27,7 +27,7 @@ Workspaces offer several key features:
for different types of tasks. Choose from CodeGate's library of community
prompts or create your own.
-- [**Model muxing**](./muxing.md): Configure the LLM provider/model for each
+- [**Model muxing**](./muxing.mdx): Configure the LLM provider/model for each
workspace, allowing you to configure your AI assistant/agent once and switch
between different models on the fly. This is useful when working on multiple
projects or tasks that require different AI models.
@@ -115,7 +115,7 @@ In the workspace list, open the menu (**...**) next to a workspace to
**Activate**, **Edit**, or **Archive** the workspace.
**Edit** opens the workspace settings page. From here you can rename the
-workspace, select the LLM provider and model (see [Model muxing](./muxing.md)),
+workspace, select the LLM provider and model (see [Model muxing](./muxing.mdx)),
set the custom prompt instructions, or archive the workspace.
**Archived** workspaces can be restored or permanently deleted from the
diff --git a/docs/index.md b/docs/index.md
index 4ecd995..003632c 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -41,7 +41,7 @@ efficiency, including:
information
- [Dependency risk awareness](./features/dependency-risk.md) to update the LLM's
knowledge of malicious or deprecated open source packages
-- [Model muxing](./features/muxing.md) to quickly select the best LLM
+- [Model muxing](./features/muxing.mdx) to quickly select the best LLM
provider/model for your current task
- [Workspaces](./features/workspaces.mdx) to organize and customize your LLM
interactions
diff --git a/docs/integrations/aider.mdx b/docs/integrations/aider.mdx
index f943d5e..2e54293 100644
--- a/docs/integrations/aider.mdx
+++ b/docs/integrations/aider.mdx
@@ -17,7 +17,7 @@ CodeGate works with the following AI model providers through aider:
- Hosted:
- [OpenAI](https://openai.com/api/) and OpenAI-compatible APIs
-You can also configure [CodeGate muxing](../features/muxing.md) to select your
+You can also configure [CodeGate muxing](../features/muxing.mdx) to select your
provider and model using [workspaces](../features/workspaces.mdx).
:::note
diff --git a/docs/integrations/avante.mdx b/docs/integrations/avante.mdx
index 5b89a9f..a6d4b9f 100644
--- a/docs/integrations/avante.mdx
+++ b/docs/integrations/avante.mdx
@@ -65,7 +65,7 @@ server.
### Model muxing
-To take advantage of CodeGate's [model muxing feature](../features/muxing.md),
+To take advantage of CodeGate's [model muxing feature](../features/muxing.mdx),
use **avante.nvim**'s OpenAI provider with the following configuration:
```lua
diff --git a/docs/integrations/cline.mdx b/docs/integrations/cline.mdx
index 0e58742..f6d36c0 100644
--- a/docs/integrations/cline.mdx
+++ b/docs/integrations/cline.mdx
@@ -21,7 +21,7 @@ CodeGate works with the following AI model providers through Cline:
- [OpenAI](https://openai.com/api/) and compatible APIs
- [OpenRouter](https://openrouter.ai/)
-You can also configure [CodeGate muxing](../features/muxing.md) to select your
+You can also configure [CodeGate muxing](../features/muxing.mdx) to select your
provider and model using [workspaces](../features/workspaces.mdx).
## Install the Cline extension
diff --git a/docs/integrations/continue.mdx b/docs/integrations/continue.mdx
index 806350b..5c797b2 100644
--- a/docs/integrations/continue.mdx
+++ b/docs/integrations/continue.mdx
@@ -25,7 +25,7 @@ CodeGate works with the following AI model providers through Continue:
- [OpenAI](https://openai.com/api/)
- [OpenRouter](https://openrouter.ai/)
-You can also configure [CodeGate muxing](../features/muxing.md) to select your
+You can also configure [CodeGate muxing](../features/muxing.mdx) to select your
provider and model using [workspaces](../features/workspaces.mdx).
## Install the Continue plugin
@@ -129,7 +129,7 @@ to the pre-release version (v0.9.x) of the Continue extension.
:::
-First, configure your [provider(s)](../features/muxing.md#add-a-provider) and
+First, configure your [provider(s)](../features/muxing.mdx#add-a-provider) and
select a model for each of your
[workspace(s)](../features/workspaces.mdx#manage-workspaces) in the CodeGate
dashboard.
diff --git a/docs/integrations/kodu.mdx b/docs/integrations/kodu.mdx
index 55c0842..ab759ef 100644
--- a/docs/integrations/kodu.mdx
+++ b/docs/integrations/kodu.mdx
@@ -14,7 +14,7 @@ take their project from idea to execution.
CodeGate supports OpenAI-compatible APIs and OpenRouter with Claude Coder.
-You can also configure [CodeGate muxing](../features/muxing.md) to select your
+You can also configure [CodeGate muxing](../features/muxing.mdx) to select your
provider and model using [workspaces](../features/workspaces.mdx).
## Install the Claude Coder extension
diff --git a/docs/integrations/open-interpreter.mdx b/docs/integrations/open-interpreter.mdx
index e551599..2356417 100644
--- a/docs/integrations/open-interpreter.mdx
+++ b/docs/integrations/open-interpreter.mdx
@@ -14,7 +14,7 @@ LLMs run code locally through a ChatGPT-like interface in your terminal.
CodeGate works with [OpenAI](https://openai.com/api/) and compatible APIs
through Open Interpreter.
-You can also configure [CodeGate muxing](../features/muxing.md) to select your
+You can also configure [CodeGate muxing](../features/muxing.mdx) to select your
provider and model using [workspaces](../features/workspaces.mdx).
:::note
@@ -33,7 +33,7 @@ set to CodeGate's local API port, `http://localhost:8989/`.
-First, configure your [provider(s)](../features/muxing.md#add-a-provider) and
+First, configure your [provider(s)](../features/muxing.mdx#add-a-provider) and
select a model for each of your
[workspace(s)](../features/workspaces.mdx#manage-workspaces) in the CodeGate dashboard.
diff --git a/docs/partials/_aider-providers.mdx b/docs/partials/_aider-providers.mdx
index bff3593..333f42e 100644
--- a/docs/partials/_aider-providers.mdx
+++ b/docs/partials/_aider-providers.mdx
@@ -7,7 +7,7 @@ import LocalModelRecommendation from './_local-model-recommendation.md';
-First, configure your [provider(s)](../features/muxing.md#add-a-provider) and
+First, configure your [provider(s)](../features/muxing.mdx#add-a-provider) and
select a model for each of your
[workspace(s)](../features/workspaces.mdx#manage-workspaces) in the CodeGate dashboard.
diff --git a/docs/partials/_cline-providers.mdx b/docs/partials/_cline-providers.mdx
index 8f5d87d..1b9fb11 100644
--- a/docs/partials/_cline-providers.mdx
+++ b/docs/partials/_cline-providers.mdx
@@ -9,7 +9,7 @@ import LocalModelRecommendation from './_local-model-recommendation.md';
-First, configure your [provider(s)](../features/muxing.md#add-a-provider) and
+First, configure your [provider(s)](../features/muxing.mdx#add-a-provider) and
select a model for each of your
[workspace(s)](../features/workspaces.mdx#manage-workspaces) in the CodeGate dashboard.
diff --git a/docs/partials/_kodu-providers.mdx b/docs/partials/_kodu-providers.mdx
index 4a4788e..5036b5e 100644
--- a/docs/partials/_kodu-providers.mdx
+++ b/docs/partials/_kodu-providers.mdx
@@ -13,7 +13,7 @@ expect to release in CodeGate v0.1.19.
:::
-First, configure your [provider(s)](../features/muxing.md#add-a-provider) and
+First, configure your [provider(s)](../features/muxing.mdx#add-a-provider) and
select a model for each of your
[workspace(s)](../features/workspaces.mdx#manage-workspaces) in the CodeGate
dashboard.
diff --git a/static/img/features/muxing-rules-dark.webp b/static/img/features/muxing-rules-dark.webp
new file mode 100644
index 0000000..f016b93
Binary files /dev/null and b/static/img/features/muxing-rules-dark.webp differ
diff --git a/static/img/features/muxing-rules-light.webp b/static/img/features/muxing-rules-light.webp
new file mode 100644
index 0000000..87311d7
Binary files /dev/null and b/static/img/features/muxing-rules-light.webp differ