Commit f9781d3: updated notebook examples
brainlid committed Jan 23, 2025 · 1 parent 1ed79b1
Showing 3 changed files with 43 additions and 28 deletions.
22 changes: 13 additions & 9 deletions notebooks/context-specific-image-descriptions.livemd
@@ -4,7 +4,7 @@

```elixir
Mix.install([
{:langchain, github: "brainlid/langchain"},
{:langchain, "~> 0.3.0"},
{:kino, "~> 0.12.0"}
])
```
@@ -79,7 +79,7 @@ This is where we add **context** to our image description request. We'll assume

**NOTE:** Make sure the `:media` option matches both the image and what is supported by the LLM you are connecting with.
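
For instance, if the image were a PNG instead, the option changes to match (a hypothetical snippet; `png_data` stands in for your own image data):

```elixir
alias LangChain.Message.ContentPart

# the :media option should reflect the actual image format
ContentPart.image!(png_data, media: :png)
```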

```elixir
````elixir
alias LangChain.Message
alias LangChain.Message.ContentPart
alias LangChain.PromptTemplate
@@ -117,17 +117,19 @@ messages = [
<%= @extra_image_info %>
Output in the following JSON format:
Output in the following format:
```json
{
"alt": "generated alt text",
"caption": "generation caption text"
}
```
"""),
ContentPart.image!(image_data, media: :jpg, detail: "low")
])
]
```
````

Before we continue, notice that the System message provides the general context for what we are doing and what we want from the LLM.

@@ -169,7 +171,7 @@ Everything is ready to make the request!

Now, we'll submit the request to the server and review the response. For this example, the "image_data_from_other_system" is a substitute for a database call or other lookup for the information we have on the image.

```elixir
````elixir
alias LangChain.Chains.LLMChain
alias LangChain.MessageProcessors.JsonProcessor

@@ -185,7 +187,7 @@ image_data_from_other_system = "image of urban art mural on underpass at 507 Kin
|> LLMChain.run(mode: :until_success)

updated_chain.last_message.processed_content
```
````

Notice that when running the chain, we use the option `mode: :until_success`. Some LLMs are better at generating valid JSON than others. When we include the `JsonProcessor`, it parses the assistant's content, converting it into an Elixir map. The converted data is stored on the message's `processed_content`.
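
The diff above elides part of the chain setup, so here's a minimal sketch of the relevant piece (assuming the `openai_chat_model` and `messages` bindings from this notebook; the regex form of `JsonProcessor.new!/1` extracts the JSON from the fenced block before decoding):

````elixir
alias LangChain.Chains.LLMChain
alias LangChain.MessageProcessors.JsonProcessor

{:ok, updated_chain} =
  %{llm: openai_chat_model}
  |> LLMChain.new!()
  |> LLMChain.add_messages(messages)
  # pull the JSON body out of the fenced ```json block, then decode it
  |> LLMChain.message_processors([JsonProcessor.new!(~r/```json(.*?)```/s)])
  # keep retrying the request until the processors succeed
  |> LLMChain.run(mode: :until_success)

updated_chain.last_message.processed_content
````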

@@ -225,12 +227,12 @@ Let's set up our Anthropic chat model.
```elixir
alias LangChain.ChatModels.ChatAnthropic

anthropic_chat_model = ChatAnthropic.new!(%{model: "claude-3-opus-20240229"})
anthropic_chat_model = ChatAnthropic.new!(%{model: "claude-3-5-sonnet-latest"})
```

Now we run the same messages through an identical LLMChain but passing in the Anthropic chat model.

```elixir
````elixir
alias LangChain.Chains.LLMChain
alias LangChain.MessageProcessors.JsonProcessor

@@ -246,7 +248,7 @@ image_data_from_other_system = "image of urban art mural on underpass at 507 Kin
|> LLMChain.run(mode: :until_success)

updated_chain.last_message.processed_content
```
````

Nice! The Elixir LangChain library abstracted away the differences between the two services. With no code changes, we can make a similar request about the image from Anthropic's Claude LLM as well!

@@ -262,3 +264,5 @@ Here's what I got from it:
```

We would want to run multiple tests on a small sampling of images and tweak our prompt until we are happy with the result. Then we can process the full batch and save our work as a template for future projects.

<!-- livebook:{"offset":12833,"stamp":{"token":"XCP.ima-5H7hVmSu3ukHhzotNc1v6kxQNfrdVgWq9fQbqecYia8FBsPHDdAnj9Kk-bZbTOj0Dm6xuwH0qQAJHp4aKwWZFcWb0V_5oRVwaD8NBLyAAq5Ih1RC1ksRmBMCJ8jH7B7tRfksKMMvbicRcpv-","version":2}} -->
39 changes: 25 additions & 14 deletions notebooks/custom_functions.livemd
@@ -4,7 +4,7 @@

```elixir
Mix.install([
{:langchain, "~> 0.3.0-rc.0"}
{:langchain, "~> 0.3.0"}
])
```

@@ -80,7 +80,8 @@ function =
name: "get_user_info",
description: "Return JSON object of the current user's relevant information.",
display_text: nil,
function: #Function<41.125776118/2 in :erl_eval.expr/6>,
strict: false,
function: #Function<41.18682967/2 in :erl_eval.expr/6>,
async: true,
parameters_schema: nil,
parameters: []
@@ -193,13 +194,17 @@ chat_model = ChatOpenAI.new!(%{model: "gpt-4o", temperature: 1, stream: false})
api_key: nil,
temperature: 1.0,
frequency_penalty: 0.0,
reasoning_mode: false,
reasoning_effort: "medium",
receive_timeout: 60000,
seed: nil,
n: 1,
json_response: false,
json_schema: nil,
stream: false,
max_tokens: nil,
stream_options: nil,
tool_choice: nil,
callbacks: [],
user: nil
}
@@ -259,13 +264,17 @@ LLM: %LangChain.ChatModels.ChatOpenAI{
api_key: nil,
temperature: 1.0,
frequency_penalty: 0.0,
reasoning_mode: false,
reasoning_effort: "medium",
receive_timeout: 60000,
seed: nil,
n: 1,
json_response: false,
json_schema: nil,
stream: false,
max_tokens: nil,
stream_options: nil,
tool_choice: nil,
callbacks: [],
user: nil
}
@@ -296,7 +305,8 @@ TOOLS: [
name: "get_user_info",
description: "Return JSON object of the current user's relevant information.",
display_text: nil,
function: #Function<41.125776118/2 in :erl_eval.expr/6>,
strict: false,
function: #Function<41.18682967/2 in :erl_eval.expr/6>,
async: true,
parameters_schema: nil,
parameters: []
@@ -313,7 +323,7 @@ SINGLE MESSAGE RESPONSE: %LangChain.Message{
%LangChain.Message.ToolCall{
status: :complete,
type: :function,
call_id: "call_Na42ylypjHbP2IkigPbDQBNV",
call_id: "call_2puDjavxfbFQ7uDvrE9vafnZ",
name: "get_user_info",
arguments: %{},
index: nil
@@ -332,7 +342,7 @@ MESSAGE PROCESSED: %LangChain.Message{
%LangChain.Message.ToolCall{
status: :complete,
type: :function,
call_id: "call_Na42ylypjHbP2IkigPbDQBNV",
call_id: "call_2puDjavxfbFQ7uDvrE9vafnZ",
name: "get_user_info",
arguments: %{},
index: nil
@@ -342,7 +352,7 @@
}
EXECUTING FUNCTION: "get_user_info"
17:37:37.179 [debug] Executing function "get_user_info"
09:10:30.474 [debug] Executing function "get_user_info"
FUNCTION RESULT: "{\"name\":\"Joan Jett\",\"user_id\":2,\"account_type\":\"member\",\"favorite_animal\":\"Aardvark\"}"
TOOL RESULTS: %LangChain.Message{
content: nil,
Expand All @@ -355,16 +365,17 @@ TOOL RESULTS: %LangChain.Message{
tool_results: [
%LangChain.Message.ToolResult{
type: :function,
tool_call_id: "call_Na42ylypjHbP2IkigPbDQBNV",
tool_call_id: "call_2puDjavxfbFQ7uDvrE9vafnZ",
name: "get_user_info",
content: "{\"name\":\"Joan Jett\",\"user_id\":2,\"account_type\":\"member\",\"favorite_animal\":\"Aardvark\"}",
processed_content: nil,
display_text: nil,
is_error: false
}
]
}
SINGLE MESSAGE RESPONSE: %LangChain.Message{
content: "Joan's favorite, \nAardvark graces night with charm, \nSilent earth's delight.",
content: "Long snout seeking ants, \nNight wanderer of moonlight, \nAardvark's gentle grace.",
processed_content: nil,
index: 0,
status: :complete,
@@ -374,7 +385,7 @@
tool_results: nil
}
MESSAGE PROCESSED: %LangChain.Message{
content: "Joan's favorite, \nAardvark graces night with charm, \nSilent earth's delight.",
content: "Long snout seeking ants, \nNight wanderer of moonlight, \nAardvark's gentle grace.",
processed_content: nil,
index: 0,
status: :complete,
@@ -383,15 +394,15 @@
tool_calls: [],
tool_results: nil
}
Joan's favorite,
Aardvark graces night with charm,
Silent earth's delight.
Long snout seeking ants,
Night wanderer of moonlight,
Aardvark's gentle grace.
```

<!-- livebook:{"output":true} -->

```
"Joan's favorite, \nAardvark graces night with charm, \nSilent earth's delight."
"Long snout seeking ants, \nNight wanderer of moonlight, \nAardvark's gentle grace."
```

**TIP:** Try changing the `context` to `user_id: 1` now and see what happens when a different user context is provided.
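
If it's not obvious where that change goes, here's a sketch of the re-run (assuming the `chat_model`, `messages`, and `function` bindings defined earlier in this notebook):

```elixir
alias LangChain.Chains.LLMChain

# hypothetical re-run: only the custom_context changes (user_id: 2 -> 1)
{:ok, updated_chain} =
  %{llm: chat_model, custom_context: %{user_id: 1}, verbose: true}
  |> LLMChain.new!()
  |> LLMChain.add_messages(messages)
  |> LLMChain.add_tools([function])
  # keep running until tool calls are resolved and a final response arrives
  |> LLMChain.run(mode: :while_needs_response)

updated_chain.last_message.content
```
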
@@ -411,4 +422,4 @@ With this, we could expose functions that allow the LLM to request additional in

The rest is up to us.

<!-- livebook:{"offset":13586,"stamp":{"token":"XCP._l56WhgDkzWkHI2NYGvSJh4deP7F3zSqtFDIS5NIYuLaU7Xo7-QRUpNA8bYELuO7Byhz7UZGwLJtRDrszDpBLbduoG4NnFeLv4BHdJ1S4rcjqbu2vVNCrOA","version":2}} -->
<!-- livebook:{"offset":13867,"stamp":{"token":"XCP.UFYBKiTKTivrWqsgBsbdfATOyTtvmvReV-H7dpctAL0z8OepBUJOuQTNOu2-7E33qPbB-Jeyp-I32rgPfwdwE98Ed2JQRodZ3PL1Pn_lWFzuVrqSKbEGO8yYAUxoqCC8kpGjDeoKqIardsK0OtpE","version":2}} -->
10 changes: 5 additions & 5 deletions notebooks/getting_started.livemd
@@ -2,7 +2,7 @@

```elixir
Mix.install([
{:langchain, "~> 0.3.0-rc.0"},
{:langchain, "~> 0.3.0"},
{:kino, "~> 0.12.0"}
])
```
@@ -97,15 +97,15 @@ handler = %{
{:ok, updated_chain} =
%{
# llm config for streaming and the deltas callback
llm: ChatOpenAI.new!(%{model: "gpt-4o", stream: true, callbacks: [handler]}),
# chain callbacks
callbacks: [handler]
llm: ChatOpenAI.new!(%{model: "gpt-4o", stream: true})
}
|> LLMChain.new!()
|> LLMChain.add_messages([
Message.new_system!("You are a helpful assistant."),
Message.new_user!("Write a haiku about the capital of the United States")
])
# register the callbacks
|> LLMChain.add_callback(handler)
|> LLMChain.run()

updated_chain.last_message.content
@@ -125,4 +125,4 @@ Finally, once the full message is received, the chain's `on_message_processed` c
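
As a reminder, the `handler` registered above is just a map of callback functions. A minimal sketch of its shape (callback names as used in this notebook; the exact callback arguments can vary slightly between library versions):

```elixir
alias LangChain.Message
alias LangChain.MessageDelta

# write each streamed delta as it arrives, then print the fully
# assembled message once the chain has processed it
handler = %{
  on_llm_new_delta: fn _chain, %MessageDelta{} = delta ->
    IO.write(delta.content)
  end,
  on_message_processed: fn _chain, %Message{} = message ->
    IO.puts("\n--\n" <> message.content)
  end
}
```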

With the basics covered, you're ready to start integrating an LLM into your Elixir application! Check out other notebooks for more specific examples and other ways to use it.

<!-- livebook:{"offset":4645,"stamp":{"token":"XCP.asU02sHrPSt9UBUyNKE-Qa7G_WNBlxckoWay59yU1E9XLuliANsNDenOGjrq1uZmDm9a0vajWcrseaOSYqLHOegBXQ7fLmYNTULDwEYEmtLVmnTEKbYN8BnqGuuayL2HKO7GZ2dWye7NzpYSoSh5","version":2}} -->
<!-- livebook:{"offset":4654,"stamp":{"token":"XCP.i78jbBuPatFHM44quDBAU4hgc6ay4m6vFYa2oeCvONoMt0BoNyGTfJYcVJQVQuY1rU0KY3XceHNz07WamDRWUW60TxM82sMwVt99AzMKnXbiI-GAzL4fJhKlGsVBAX0NB0zACWWvoFi4eVu2RsMi","version":2}} -->
