
feat: offline support for tiktoken #7588

Open

wants to merge 2 commits into base: main

Conversation

Hexoplon (Contributor) commented Jan 6, 2025

Title

Implement pre-download of tiktoken tokenizer file for the non_root offline image

Relevant issues

N/A

Type

🆕 New Feature

Changes

  • In Dockerfile.non_root, invoke tiktoken at build time with a custom cache directory so the tokenizer file is downloaded into the image.
  • Set the CUSTOM_TIKTOKEN_CACHE_DIR environment variable in the final image so LiteLLM picks up the correct offline cache directory.
  • Update the offline-deployment documentation to mention this change.
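The steps above might look roughly like the following in Dockerfile.non_root. This is a sketch, not the PR's actual diff: the cache path `/app/tiktoken_cache` and the choice of the `cl100k_base` encoding are assumptions, and `TIKTOKEN_CACHE_DIR` is the variable tiktoken itself reads at runtime.

```dockerfile
# Sketch: pre-populate the tiktoken cache at image build time so the
# container can tokenize without internet access.
# (Paths and the encoding name are illustrative assumptions.)

# tiktoken reads TIKTOKEN_CACHE_DIR to decide where to cache BPE files.
ENV TIKTOKEN_CACHE_DIR=/app/tiktoken_cache

# Trigger the download during the build; the file lands in the cache dir.
RUN python -c "import tiktoken; tiktoken.get_encoding('cl100k_base')"

# Tell LiteLLM where the pre-populated cache lives in the final image.
ENV CUSTOM_TIKTOKEN_CACHE_DIR=/app/tiktoken_cache
```

At runtime, tiktoken finds the pre-downloaded file in the cache directory and never attempts a network fetch.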

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

Tested by running an OpenAI / unrecognized model, which defaults to tiktoken.
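One way to spot-check that the pre-download worked is to look for the cached file itself. In current tiktoken versions the cached filename is (as far as I can tell, an internal detail that may change) the SHA-1 hex digest of the source blob URL; the sketch below only illustrates that lookup, with a hypothetical cache path:

```python
import hashlib
import os

def expected_cache_path(blob_url: str, cache_dir: str) -> str:
    """Return where tiktoken would cache the file for blob_url.

    Assumption: tiktoken names cached files by the SHA-1 hex digest of
    the source URL (an undocumented internal detail of the library).
    """
    return os.path.join(cache_dir, hashlib.sha1(blob_url.encode()).hexdigest())

# Hypothetical URL and cache dir, for illustration only.
url = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"
print(expected_cache_path(url, "/app/tiktoken_cache"))
```

If the file at that path exists in the built image, tiktoken will use it instead of downloading.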

Future work

Pre-download the llama-2 and llama-3 tokenizers. The Cohere and Anthropic tokenizers are probably not required, as those models are not available for self-hosting without an internet connection anyway.

vercel bot commented Jan 6, 2025

litellm: ✅ Ready (preview updated Jan 6, 2025, 8:03pm UTC)

@Hexoplon changed the title from "Feat/offline tiktoken" to "feat: offline support for tiktoken" on Jan 6, 2025