Big update - RC ready (#138)
* functions for adding callbacks
- LLMChain supports adding a callback to its model
- makes it easier to hook up a callback chain when restoring a model from a config or a router

* chat models support config serialization
- closes #104 discussion
- ChatModel top-level functions
- ChatOpenAI
- ChatAnthropic
- ChatBumblebee
- Utils functions
- ChatGoogleAI - added callback support (I think)
- ChatVertexAI

* ollama updated for initial support of callbacks
- supports serializing

* updated ChatMistralAI
- callbacks support
- new api
- serializing

* updated req

* minor update
brainlid authored Jun 18, 2024
1 parent a6af118 commit 3f0d1ad
Showing 23 changed files with 931 additions and 111 deletions.
9 changes: 9 additions & 0 deletions CHANGELOG.md
@@ -14,6 +14,14 @@
* The attribute `processed_content` was added to `LangChain.Message`. When a MessageProcessor is run on a received assistant message, the results of the processing accumulate there. The original `content` remains unchanged so it can be sent back to the LLM and used when fixing or correcting its generated content.
* Callback support for LLM rate limit information returned in API response headers. These are currently implemented for Anthropic and OpenAI.
* Callback support for LLM token usage information returned when available.
* `LangChain.ChatModels.ChatModel` additions
* Added `add_callback/2` to make it easier to add a callback to a chat model.
* Added `serialize_config/1` to serialize an LLM chat model configuration to a map that can be restored later.
* Added `restore_from_map/1` to restore a configured LLM chat model from a database (for example).
* `LangChain.Chain.LLMChain` additions
* New function `add_callback/2` makes it easier to add a callback to an existing `LLMChain`.
* New function `add_llm_callback/2` makes it easier to add a callback to a chain's LLM. This is particularly useful when an LLM model is restored from a database when loading a past conversation and wanting to preserve the original configuration.
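The callback additions above can be sketched as follows. This is a minimal sketch: the handler keys shown (`on_message_processed`, `on_llm_token_usage`) are assumptions for illustration — see `LangChain.Chains.ChainCallbacks` and the chat model callback docs for the supported keys.

```elixir
alias LangChain.Chains.LLMChain
alias LangChain.ChatModels.ChatOpenAI

# Chain-level handlers (keys assumed for illustration).
chain_handlers = %{
  on_message_processed: fn _chain, message ->
    IO.inspect(message, label: "processed")
  end
}

# Model-level handlers, e.g. for the new token usage callback.
llm_handlers = %{
  on_llm_token_usage: fn _model, usage ->
    IO.inspect(usage, label: "usage")
  end
}

chain =
  LLMChain.new!(%{llm: ChatOpenAI.new!(%{model: "gpt-4"})})
  |> LLMChain.add_callback(chain_handlers)
  |> LLMChain.add_llm_callback(llm_handlers)
```

`add_llm_callback/2` attaches the handlers to the chain's `:llm` struct itself, which is what makes them survive when the model is later serialized and restored.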


**Changed:**

@@ -22,6 +30,7 @@
* Many smaller changes and contributions were made, including updates to the README for clarity.
* `LangChain.Utils.fire_callback/3` was refactored into `LangChain.Utils.fire_streamed_callback/2` where it is only used for processing deltas and uses the new callback mechanism.
* Notebooks were moved to the separate demo project
* `LangChain.ChatModels.ChatGoogleAI`'s key `:version` was changed to `:api_version` to be more consistent with other models and allow for model serializers to use the `:version` key.

### Migrations Steps

10 changes: 10 additions & 0 deletions lib/chains/llm_chain.ex
@@ -45,6 +45,7 @@ defmodule LangChain.Chains.LLMChain do
use Ecto.Schema
import Ecto.Changeset
require Logger
alias LangChain.ChatModels.ChatModel
alias LangChain.Callbacks
alias LangChain.Chains.ChainCallbacks
alias LangChain.PromptTemplate
@@ -819,6 +820,15 @@ defmodule LangChain.Chains.LLMChain do
%LLMChain{chain | callbacks: callbacks ++ [additional_callback]}
end

@doc """
Add a `LangChain.ChatModels.LLMCallbacks` callback map to the chain's `:llm` model if
it supports the `:callbacks` key.
"""
@spec add_llm_callback(t(), map()) :: t()
def add_llm_callback(%LLMChain{llm: model} = chain, callback_map) do
%LLMChain{chain | llm: ChatModel.add_callback(model, callback_map)}
end

# a pipe-friendly execution of callbacks that returns the chain
defp fire_callback_and_return(%LLMChain{} = chain, callback_name, additional_arguments)
when is_list(additional_arguments) do
33 changes: 33 additions & 0 deletions lib/chat_models/chat_anthropic.ex
@@ -58,6 +58,8 @@ defmodule LangChain.ChatModels.ChatAnthropic do

@behaviour ChatModel

@current_config_version 1

# allow up to 1 minute for response.
@receive_timeout 60_000

@@ -866,4 +868,35 @@ defmodule LangChain.ChatModels.ChatAnthropic do
end

defp get_token_usage(_response_body), do: %{}

@doc """
Generate a config map that can later restore the model's configuration.
"""
@impl ChatModel
@spec serialize_config(t()) :: %{String.t() => any()}
def serialize_config(%ChatAnthropic{} = model) do
Utils.to_serializable_map(
model,
[
:endpoint,
:model,
:api_version,
:temperature,
:max_tokens,
:receive_timeout,
:top_p,
:top_k,
:stream
],
@current_config_version
)
end

@doc """
Restores the model from the config.
"""
@impl ChatModel
def restore_from_map(%{"version" => 1} = data) do
ChatAnthropic.new(data)
end
end
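A minimal sketch of the serialize/restore round trip these functions enable — for example, persisting a model's configuration in a database column. The model name used here, and the shape of the serialized map beyond its `"version"` entry, are assumptions for illustration.

```elixir
alias LangChain.ChatModels.ChatAnthropic

model =
  ChatAnthropic.new!(%{model: "claude-3-haiku-20240307", temperature: 0.7})

# Serialize to a plain map (including "version" => 1) that can be
# stored as JSON and later restored.
config = ChatAnthropic.serialize_config(model)

# Rebuild a configured model from the stored map.
{:ok, restored} = ChatAnthropic.restore_from_map(config)
```

Because `restore_from_map/1` pattern-matches on `%{"version" => 1}`, the `@current_config_version` baked into the serialized map allows future config formats to be migrated or handled separately.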
45 changes: 45 additions & 0 deletions lib/chat_models/chat_bumblebee.ex
@@ -112,6 +112,8 @@ defmodule LangChain.ChatModels.ChatBumblebee do

@behaviour ChatModel

@current_config_version 1

@primary_key false
embedded_schema do
# Name of the Nx.Serving to use when working with the LLM.
@@ -164,6 +166,7 @@ defmodule LangChain.ChatModels.ChatBumblebee do
def new(%{} = attrs \\ %{}) do
%ChatBumblebee{}
|> cast(attrs, @create_fields)
|> restore_serving_if_string()
|> common_validation()
|> apply_action(:insert)
end
@@ -182,6 +185,22 @@ defmodule LangChain.ChatModels.ChatBumblebee do
end
end

defp restore_serving_if_string(changeset) do
case get_field(changeset, :serving) do
value when is_binary(value) ->
case Utils.module_from_name(value) do
{:ok, module} ->
put_change(changeset, :serving, module)

{:error, reason} ->
add_error(changeset, :serving, reason)
end

_other ->
changeset
end
end

defp common_validation(changeset) do
changeset
|> validate_required(@required_fields)
@@ -322,4 +341,30 @@ defmodule LangChain.ChatModels.ChatBumblebee do
end

defp fire_token_usage_callback(_model, _token_summary), do: :ok

@doc """
Generate a config map that can later restore the model's configuration.
"""
@impl ChatModel
@spec serialize_config(t()) :: %{String.t() => any()}
def serialize_config(%ChatBumblebee{} = model) do
Utils.to_serializable_map(
model,
[
:serving,
:template_format,
:stream,
:seed
],
@current_config_version
)
end

@doc """
Restores the model from the config.
"""
@impl ChatModel
def restore_from_map(%{"version" => 1} = data) do
ChatBumblebee.new(data)
end
end
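For ChatBumblebee, the `:serving` field holds a module reference that cannot be stored directly as JSON, so the serialized config carries the module's name as a string and `restore_serving_if_string/1` converts it back on restore. A sketch, where `MyApp.ChatServing` is a hypothetical serving module name:

```elixir
alias LangChain.ChatModels.ChatBumblebee

# MyApp.ChatServing stands in for a real Nx.Serving name.
model = ChatBumblebee.new!(%{serving: MyApp.ChatServing})

# :serving is serialized by name so the map is storable as JSON.
config = ChatBumblebee.serialize_config(model)

# On restore, the string module name is resolved back to the module
# atom by restore_serving_if_string/1 inside new/1.
{:ok, restored} = ChatBumblebee.restore_from_map(config)
```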