
fix: make token counting optional in chat client #813

Merged: 1 commit into CopilotC-Nvim:main on Feb 19, 2025

Conversation

deathbeam (Collaborator)

Currently, token counting and message limiting are enforced for all chat providers, but they should be optional, since not all chat providers support token limiting. This change makes token counting optional, enabling it only when max_tokens and tokenizer are provided in the model config.

The types in providers.lua have also been updated to reflect the optional nature of these fields.

Closes #812
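The gating described above can be sketched as follows. This is a hedged illustration, not the actual plugin code: the field names `max_tokens` and `tokenizer` come from the PR description, while `count_tokens` and `limit_messages` are hypothetical stand-ins for the real tokenizer and trimming logic.

```lua
-- Sketch of the behavior this PR describes: token limiting runs only when
-- the model config supplies BOTH max_tokens and tokenizer; otherwise the
-- messages pass through untouched. count_tokens is a crude stand-in.

local function count_tokens(text)
  -- Rough stand-in: one "token" per whitespace-separated word.
  local n = 0
  for _ in text:gmatch("%S+") do
    n = n + 1
  end
  return n
end

local function limit_messages(messages, model_config)
  if not (model_config.max_tokens and model_config.tokenizer) then
    -- Provider does not support token limiting: keep all messages.
    return messages
  end
  -- Keep the most recent messages that fit within max_tokens.
  local kept, total = {}, 0
  for i = #messages, 1, -1 do
    local cost = count_tokens(messages[i].content)
    if total + cost > model_config.max_tokens then
      break
    end
    total = total + cost
    table.insert(kept, 1, messages[i])
  end
  return kept
end
```

With an empty model config (as with a provider that reports no `max_tokens`, cf. the linked ollama issue), every message is kept; with both fields set, older messages are dropped once the budget is exceeded.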

@deathbeam deathbeam added the bug Something isn't working label Feb 19, 2025
@deathbeam deathbeam merged commit dd9225f into CopilotC-Nvim:main Feb 19, 2025
1 of 2 checks passed
@deathbeam deathbeam deleted the fix-def branch February 19, 2025 10:36
Labels: bug (Something isn't working)
Successfully merging this pull request may close these issues:

- max_tokens is nil when running local ollama:deepseek-r1

1 participant