diff --git a/docs/cody/faq.mdx b/docs/cody/faq.mdx
index fee3d1dff..fd701e497 100644
--- a/docs/cody/faq.mdx
+++ b/docs/cody/faq.mdx
@@ -137,6 +137,23 @@ cody chat --model '$name_of_the_model' -m 'Hi Cody!'
 
 For example, to use Claude 3.5 Sonnet, you'd pass the following command in your terminal, `cody chat --model 'claude-3.5-sonnet' -m 'Hi Cody!'
 
+### Can I configure custom context windows for API-only access?
+
+Yes, you can configure a custom context window that is available exclusively for programmatic access via the API. To do this, use a custom [model configuration](https://sourcegraph.com/docs/cody/enterprise/model-configuration#model-overrides) override and set its `capabilities` list to an empty array.
+
+Here's an example configuration:
+
+```json
+  "contextWindow": {
+    // Specifies the maximum number of tokens for the contextual data in the prompt (e.g., question, relevant snippets)
+    "maxInputTokens": 45000,
+    // Specifies the maximum number of tokens allowed in the response
+    "maxOutputTokens": 8000
+  },
+  // Set the "capabilities" list to an empty array so the model is not available in Cody Web or the IDE extensions
+  "capabilities": [],
+```
+
 ## OpenAI o1
 
 ### What are OpenAI o1 best practices?
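
Below is a rough sketch of how the fragment above might sit inside a complete `modelOverrides` entry under `modelConfiguration` in the site configuration. The surrounding fields and values (`modelRef`, `displayName`, `modelName`, and the Anthropic identifiers) are illustrative assumptions based on the linked model configuration page, not part of this change:

```json
{
  "modelConfiguration": {
    "modelOverrides": [
      {
        // Illustrative identifiers; substitute values that match your provider setup
        "modelRef": "anthropic::2024-10-22::claude-3-5-sonnet-api-only",
        "displayName": "Claude 3.5 Sonnet (API only)",
        "modelName": "claude-3-5-sonnet-latest",
        "contextWindow": {
          // Larger input window exposed only to API clients
          "maxInputTokens": 45000,
          // Cap on tokens in the response
          "maxOutputTokens": 8000
        },
        // Empty capabilities keep the model out of Cody Web and the IDE extensions
        "capabilities": []
      }
    ]
  }
}
```

The idea is that API clients can request this model explicitly by its `modelRef`, while Cody Web and the IDE extensions never list it in their model pickers.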