
Langchain integration #28

Open
mathieuisabel opened this issue Jul 22, 2024 · 3 comments

@mathieuisabel

How can RouteLLM be used with LangChain?

@iojw
Collaborator

iojw commented Jul 22, 2024

Hi! Do you have a specific use case in mind? We don't currently have a LangChain integration, but I assume RouteLLM could be used in place of direct model calls: instead of always using GPT-4, you would have a router that routes between two models and returns the response.

@mathieuisabel
Author

mathieuisabel commented Jul 22, 2024

I was thinking of it more as a drop-in replacement for the models, i.e. right now you can set llm to either ChatOpenAI or ChatAnthropic and you don't have to change the definition of the chain itself:

```python
llm = ChatOpenAI(
    model_name=model_name,
    temperature=0.0,
    openai_api_key=os.getenv('OpenAI__ApiKey'),
    max_retries=3,
    request_timeout=240,
)
```

or

```python
llm = ChatAnthropic(
    model_name=model_name,
    temperature=0.0,
    api_key=os.getenv('Anthropic__ApiKey'),
    max_retries=3,
    timeout=10,
)
```

and then:

This also touches on function calling; see #20 as a consideration for a drop-in replacement.

```python
chain = prompt | llm.bind_tools([structure_segment_function])
response = chain.invoke(chain_payload)
```

@iojw
Collaborator

iojw commented Jul 28, 2024

Yes exactly, a drop-in replacement for models is what I had in mind. We'd be happy to accept any contributions here!
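For readers looking for a starting point, here is a minimal sketch of the selection logic such a drop-in would need. The score function below is a hypothetical stand-in for a real RouteLLM routing call (a real integration would delegate the decision to RouteLLM's router rather than a toy heuristic), and the model names and threshold are illustrative only:

```python
# Sketch of router-based model selection for a LangChain chain.
# NOTE: route_model is a hypothetical stand-in for a real RouteLLM router;
# a real integration would ask RouteLLM for the routing decision instead
# of computing this toy difficulty score.
def route_model(prompt_text: str, threshold: float = 0.5) -> str:
    """Pick a strong or weak model name from a toy difficulty score in [0, 1]."""
    score = min(len(prompt_text) / 200.0, 1.0)  # stand-in for a learned router score
    return "gpt-4o" if score >= threshold else "gpt-4o-mini"

# Usage sketch (assumes langchain-openai is installed; names are illustrative):
# from langchain_openai import ChatOpenAI
# llm = ChatOpenAI(
#     model=route_model(user_prompt),
#     temperature=0.0,
#     openai_api_key=os.getenv('OpenAI__ApiKey'),
# )
# chain = prompt | llm.bind_tools([structure_segment_function])
```

RouteLLM's README also describes an OpenAI-compatible server mode; if that fits the use case, ChatOpenAI could simply point its base URL at the RouteLLM server and no wrapper code would be needed at all, though I haven't tested that path.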
