
OpenAI Batch API #1286

Open
luarss opened this issue Jan 19, 2025 · 1 comment
@luarss
Contributor

luarss commented Jan 19, 2025

Is your feature request related to a problem? Please describe.
This is closely linked to #500, which (correct me if I am wrong) should already be implemented, but not via the Batch API. Integrating the Batch API could reduce evaluation costs by up to 50% [1], [2].

Describe the solution you'd like
Ideally, have a switch that lets the user select batch_mode, available only when the user has selected a valid model with batch API support (e.g. OpenAI/Gemini).

Describe alternatives you've considered
n/a

Additional context
n/a
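To make the request concrete: a batch-mode switch would need to assemble per-evaluation requests into the JSONL format the OpenAI Batch API expects, one request object per line with a `custom_id` for matching results back. A minimal sketch (the helper name and model are illustrative, not deepeval's actual API):

```python
import json

def build_batch_requests(prompts, model="gpt-4o-mini"):
    """Build JSONL lines in the shape the OpenAI Batch API expects.

    Each line is a self-contained chat-completion request; custom_id
    lets the caller match batch results back to the original
    evaluation. This is a sketch, not deepeval's implementation.
    """
    lines = []
    for i, prompt in enumerate(prompts):
        lines.append(json.dumps({
            "custom_id": f"eval-{i}",
            "method": "POST",
            "url": "/v1/chat/completions",
            "body": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }))
    return "\n".join(lines)

jsonl = build_batch_requests(
    ["Is the answer relevant?", "Is the answer faithful?"]
)
print(jsonl.splitlines()[0])
```

Per OpenAI's batch documentation, the resulting file would then be uploaded with `client.files.create(..., purpose="batch")`, submitted via `client.batches.create(input_file_id=..., endpoint="/v1/chat/completions", completion_window="24h")`, and polled until the output file is ready.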

@penguine-ip
Contributor

Hey @luarss, will try to look at this a bit later. Batch mode right now is not the best-supported part of deepeval since we're not the most familiar with it. If you have any suggestions or PRs, we would also greatly appreciate it.
