Use AWS Bedrock Models #2669

Open
nrcoleman opened this issue Feb 3, 2025 · 0 comments
nrcoleman commented Feb 3, 2025

I am trying to use lm-eval-harness to evaluate Claude Sonnet, Claude Haiku, Mistral, Meta Llama, and Titan models, all accessed through AWS Bedrock. In anthropic_llms.py, the harness calls self.client.get_tokenizer() to obtain the Anthropic tokenizer, but I am using an AWS Bedrock client rather than an Anthropic client. How can I get access to this tokenizer without an Anthropic API key? The tokenizer does not appear to be available through tiktoken or Hugging Face. Is there another way to run lm-eval-harness on Amazon Bedrock models?
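To clarify the setup, here is a minimal sketch of how a Bedrock-hosted Anthropic model can be invoked directly with boto3, bypassing the Anthropic client (and its tokenizer) entirely. The region, model ID, and `anthropic_version` value are illustrative assumptions, not something lm-eval-harness does today:

```python
import json


def build_anthropic_body(prompt, max_tokens=256):
    """Build a Bedrock request body using the Anthropic Messages schema."""
    return json.dumps({
        # Version string used by Bedrock's Anthropic models; verify against
        # the current AWS documentation for your model.
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    })


def invoke(client, model_id, prompt):
    """Call Bedrock's InvokeModel and return the parsed JSON response."""
    resp = client.invoke_model(modelId=model_id, body=build_anthropic_body(prompt))
    return json.loads(resp["body"].read())


if __name__ == "__main__":
    # Requires AWS credentials and Bedrock model access in your account.
    import boto3

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    out = invoke(client, "anthropic.claude-3-haiku-20240307-v1:0", "Hello")
    print(out)
```

The open question is that a harness still needs token counts for truncation and logging, which is what `get_tokenizer()` provides in the Anthropic-specific path.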
