Support various LLM providers (Ollama, Groq, Together...) + docs (#75)

* Support Replicate-hosted LLMs
* Add support for Together, Groq, and Ollama LLMs in the Langchain and LlamaIndex integrations
* Replace `OllamaLLM` with `ChatOllama` in the `langchain_ollama_llm` function
* Fix the import path for `ChatOllama` in the `langchain_ollama_llm` function
* LLM-related docs & refactoring
Choosing LLMs
=============

Generally, the interaction with an LLM is up to the agent implementation.
However, as motleycrew integrates with several agent frameworks, there is some common ground for how to choose LLMs.

Providing an LLM to an agent
----------------------------

In general, you can pass a specific LLM to the agent you're using.
.. code-block:: python

    from motleycrew.agents.langchain import ReActToolCallingMotleyAgent
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(model="gpt-4o", temperature=0)
    agent = ReActToolCallingMotleyAgent(llm=llm, tools=[...])

The LLM class depends on the agent framework you're using.
That's why we have an ``init_llm`` function to help you set up the LLM.

.. code-block:: python

    from motleycrew.common.llms import init_llm
    from motleycrew.common import LLMFramework, LLMProvider

    llm = init_llm(
        llm_framework=LLMFramework.LANGCHAIN,
        llm_provider=LLMProvider.ANTHROPIC,
        llm_name="claude-3-5-sonnet-20240620",
        llm_temperature=0,
    )
    agent = ReActToolCallingMotleyAgent(llm=llm, tools=[...])

The currently supported frameworks (:py:class:`motleycrew.common.enums.LLMFramework`) are:

- :py:class:`Langchain <motleycrew.common.enums.LLMFramework.LANGCHAIN>` for Langchain-based agents from Langchain, CrewAI, motleycrew, etc.
- :py:class:`LlamaIndex <motleycrew.common.enums.LLMFramework.LLAMA_INDEX>` for LlamaIndex-based agents.

The currently supported LLM providers (:py:class:`motleycrew.common.enums.LLMProvider`) are:

- :py:class:`OpenAI <motleycrew.common.enums.LLMProvider.OPENAI>`
- :py:class:`Anthropic <motleycrew.common.enums.LLMProvider.ANTHROPIC>`
- :py:class:`Groq <motleycrew.common.enums.LLMProvider.GROQ>`
- :py:class:`Together <motleycrew.common.enums.LLMProvider.TOGETHER>`
- :py:class:`Replicate <motleycrew.common.enums.LLMProvider.REPLICATE>`
- :py:class:`Ollama <motleycrew.common.enums.LLMProvider.OLLAMA>`

Please raise an issue if you need to add support for another LLM provider.
Default LLM
-----------

At present, we default to OpenAI's latest ``gpt-4o`` model for our agents,
and rely on the user to set the ``OPENAI_API_KEY`` environment variable.

You can control the default LLM as follows:

.. code-block:: python

    from motleycrew.common import Defaults

    Defaults.DEFAULT_LLM_PROVIDER = "the_new_default_LLM_provider"
    Defaults.DEFAULT_LLM_NAME = "name_of_the_new_default_model_from_the_provider"
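The mechanics of such an override can be sketched without motleycrew installed. The names below mirror the ``Defaults`` pattern but are a hypothetical stand-in, not motleycrew's actual code; the point is that the default is read at call time, so an override set before agent creation takes effect.

```python
# Hypothetical sketch of call-time default resolution (not motleycrew's
# actual implementation).
class Defaults:
    DEFAULT_LLM_PROVIDER = "openai"
    DEFAULT_LLM_NAME = "gpt-4o"


def resolve_llm_name(llm_name=None):
    # Fall back to the class attribute only when no explicit name is given,
    # reading it at call time so later overrides are picked up.
    return llm_name if llm_name is not None else Defaults.DEFAULT_LLM_NAME


name_before = resolve_llm_name()
Defaults.DEFAULT_LLM_NAME = "my_new_default_model"
name_after = resolve_llm_name()
```

An explicitly passed name always wins over the default, so overriding ``Defaults`` never affects agents that were given a concrete LLM.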

Using custom LLMs
-----------------

To use a custom LLM provider, either as the default or via the ``init_llm`` function,
make sure that for every framework you're using (currently Langchain and/or LlamaIndex),
``LLM_MAP`` has an entry for that provider, for example:

.. code-block:: python

    from motleycrew.common import LLMFramework
    from motleycrew.common.llms import LLM_MAP

    LLM_MAP[(LLMFramework.LANGCHAIN, "MyLLMProvider")] = my_langchain_llm_factory
    LLM_MAP[(LLMFramework.LLAMA_INDEX, "MyLLMProvider")] = my_llamaindex_llm_factory

Here each LLM factory is a function with the signature
``def llm_factory(llm_name: str, llm_temperature: float, **kwargs)`` that returns a model object for the relevant framework.
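The dispatch idea behind this registry can be sketched without motleycrew or any framework installed. All names below are hypothetical stand-ins (the factory returns a plain dict instead of a real model object):

```python
# Hypothetical registry keyed by (framework, provider) tuples, mirroring
# the LLM_MAP pattern; not motleycrew's actual implementation.
LLM_MAP = {}


def my_langchain_llm_factory(llm_name, llm_temperature, **kwargs):
    # A real factory would return e.g. a Langchain chat model here.
    return {"framework": "langchain", "model": llm_name, "temperature": llm_temperature}


LLM_MAP[("langchain", "MyLLMProvider")] = my_langchain_llm_factory


def init_llm(llm_framework, llm_provider, llm_name, llm_temperature=0.0):
    # Look up the factory for this (framework, provider) pair and call it.
    try:
        factory = LLM_MAP[(llm_framework, llm_provider)]
    except KeyError:
        raise ValueError(f"No LLM factory for {(llm_framework, llm_provider)}")
    return factory(llm_name=llm_name, llm_temperature=llm_temperature)


llm = init_llm("langchain", "MyLLMProvider", "my-model", 0.2)
```

Because the key is the (framework, provider) pair, the same provider name can map to different factories for Langchain and LlamaIndex.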

For example, this is the built-in OpenAI model factory for Langchain:

.. code-block:: python

    def langchain_openai_llm(
        llm_name: str = Defaults.DEFAULT_LLM_NAME,
        llm_temperature: float = Defaults.DEFAULT_LLM_TEMPERATURE,
        **kwargs,
    ):
        from langchain_openai import ChatOpenAI

        return ChatOpenAI(model=llm_name, temperature=llm_temperature, **kwargs)
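A custom factory follows the same signature contract. Here is a dependency-free sketch (the ``GatewayChatModel`` class and its endpoint are hypothetical, standing in for an in-house Langchain- or LlamaIndex-compatible wrapper):

```python
from dataclasses import dataclass


# Hypothetical stand-in for an in-house chat-model wrapper; a real factory
# would return a framework-compatible model object instead.
@dataclass
class GatewayChatModel:
    model: str
    temperature: float
    base_url: str = "https://llm-gateway.internal/v1"  # hypothetical endpoint


def gateway_llm_factory(llm_name="gpt-4o", llm_temperature=0.0, **kwargs):
    # Same (llm_name, llm_temperature, **kwargs) contract as the built-in
    # factories, so it can be registered in LLM_MAP as-is.
    return GatewayChatModel(model=llm_name, temperature=llm_temperature, **kwargs)


llm = gateway_llm_factory(llm_name="my-model", llm_temperature=0.5)
```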

You can also overwrite the ``LLM_MAP`` values for the built-in providers, such as OpenAI,
if you want to use an in-house wrapper for the Langchain or LlamaIndex model adapters
(for example, to route requests through an internal gateway instead of hitting the OpenAI endpoints directly).

Note that at present, if you use Autogen with motleycrew, you will need to separately control
the models that Autogen uses, via the Autogen-specific APIs.
@@ -9,5 +9,5 @@ Usage
    key_concepts
    agents
    knowledge_graph
-   another_llm
+   choosing_llms
    caching_observability