Support more EvalClient formats based on LLM APIs #176
If you are interested, it would be a good project to pick one of the services and implement an `EvalClient`. Here is the reference implementation:
Should we add Vertex AI near the top of the list as well?

Sounds reasonable, added!!
By the way, does LangCheck support OpenRouter? It is a relatively popular platform for querying LLMs.

Wow, I didn't know this service, but it looks pretty cool!!
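For context, OpenRouter exposes an OpenAI-compatible chat completions endpoint, so querying it is a plain HTTP POST. A minimal sketch (the endpoint URL follows OpenRouter's documented OpenAI-compatible convention; the model name is illustrative only):

```python
import json

# OpenRouter mirrors the OpenAI chat completions API shape.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(api_key: str, model: str, prompt: str) -> tuple[dict, bytes]:
    """Return (headers, body) for a chat completion request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return headers, body

# The request can then be sent with any HTTP client, e.g.:
#   urllib.request.urlopen(
#       urllib.request.Request(OPENROUTER_URL, data=body, headers=headers))
```

Because the payload shape is the same as OpenAI's, an OpenRouter client could likely reuse most of an existing OpenAI-style `EvalClient`.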
There are more and more services that let you call LLMs with HTTP requests. Integrating those services into LangCheck in the form of an `EvalClient` could expand LangCheck's user base. I did quick research on services we could potentially support:
- LLM hosting services
- APIs provided by model developers
(I personally feel the items listed on top are more important, but that ordering is subjective and unreliable.)
We can split up the work: for each service, investigate the API spec and implement the `EvalClient`!!
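The division of work above could be expressed in code by factoring the per-service part into a single injected request function. This is a hypothetical sketch: the `EvalClient` name comes from the issue, but the method names and constructor here are assumptions, not LangCheck's actual interface.

```python
from abc import ABC, abstractmethod
from typing import Callable

class EvalClient(ABC):
    """Assumed base-class shape; LangCheck's real interface may differ."""

    @abstractmethod
    def get_text_responses(self, prompts: list[str]) -> list[str]:
        """Return one model response per prompt."""

class HTTPEvalClient(EvalClient):
    """EvalClient for any service reachable as an HTTP chat API.

    The `send` callable is the only service-specific piece, so adding a
    new hosted service (OpenRouter, Vertex AI, ...) means supplying just
    its request function.
    """

    def __init__(self, send: Callable[[str], str]):
        self._send = send  # maps a prompt to the model's text response

    def get_text_responses(self, prompts: list[str]) -> list[str]:
        return [self._send(p) for p in prompts]
```

With this split, each contributor investigating a service only needs to write and test its `send` function against that service's API spec.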