M5 Update #7

Merged (2 commits) on Jan 15, 2025

45 changes: 44 additions & 1 deletion CHANGELOG.md
@@ -1,6 +1,49 @@
# Changelog

## 2025-01-15

- `OllamaOptions`: use `.builder()` instead of `create()`
- Refactor client creation (with options and system prompt)

author: [email protected]
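
For reference, a minimal before/after sketch of this migration; both forms are taken from the exercise diffs below:

```java
// Before this update: static factory plus chained with* setters.
OllamaOptions legacy = OllamaOptions.create()
        .withModel("mistral:7b")
        .withTemperature(0.8);

// After this update: fluent builder, used throughout the exercises.
OllamaOptions current = OllamaOptions.builder()
        .model("mistral:7b")
        .temperature(0.8)
        .build();
```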

## 2025-01-07

Add quickstart links for GitHub Codespaces and GitPod
author: [email protected]

## 2025-01-02

### Bump

- Spring AI 1.0.0-M5
- Spring Boot 3.4.1
- Spring Shell 3.4.0
- Lombok 1.18.34

### Docs

- Add snippets in exercises
- Update conclusions

author: [email protected]

## 2024-12-13

Spring AI 1.0.0-M4
author: [email protected]

## 2024-10-14

Spring AI 1.0.0-M3: temperature is now a double
Fix the system prompt content to avoid unnecessary line breaks in responses to the `llmctx` command (exercise 4)
author: [email protected]

## 2024-10-01

GitHub build action and code owners setup
author: [email protected]

## 2024-09-26

First public release with Spring AI 1.0.0-M2 and Spring Boot 3.3.2
author: [email protected]
36 changes: 18 additions & 18 deletions workshop/exercise-1.md
@@ -10,18 +10,16 @@ Modify the `LLMService` class.

We will use a `ChatClient` object to interact with the LLM. This object can be built with the `ChatClient.Builder` that is already instantiated for us by Spring Boot autoconfiguration.

Create a private final `ChatClient` attribute named `chatClient`.
In the `LLMService` constructor, set `chatClient` to the result of calling `.defaultSystem(promptSystem).build()` on `builder`.

```java
private final ChatClient chatClient;

public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    this.chatClient = builder
            .defaultSystem(promptSystem)
            .build();
}
```

@@ -35,36 +33,38 @@ Please answer the question asked and provide the shortest possible response with

### Part 3 - Create query options object

Create an `OllamaOptions` attribute and initialize it in the constructor using the `OllamaOptions.builder()` method; build it with model `mistral:7b` and temperature `0.8`.

```java
this.options = OllamaOptions.builder()
        .model("mistral:7b")
        .temperature(0.8)
        .build();
```

### Part 4 - Implement the model query in streaming mode

Complete the existing `getResponse` method with the following steps:

1. Create a new `Prompt` object using the `Prompt(List<Message> messages)` constructor, passing the previously created message list as the argument.
2. Call `chatClient.prompt(...)` with the `Prompt` object, pass the `options` attribute through the fluent `.options(...)` call, and then invoke `.stream()`.
3. Map and return the streamed result.

```java
private Stream<String> getResponse(final Message userMessage) {

    List<Message> messages = new ArrayList<>();
    messages.add(userMessage);

    Prompt prompt = new Prompt(messages);
    return chatClient.prompt(prompt)
            .options(options)
            .stream()
            .chatResponse().toStream()
            .map(ChatResponse::getResults)
            .flatMap(List::stream)
            .map(Generation::getOutput)
            .map(AssistantMessage::getText);
}
```
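
To see the streaming in action, here is a hypothetical caller sketch. It assumes an injected `LLMService` instance and the public `askQuestion(String)` method shown in the solution file, which presumably wraps the question into a `UserMessage` and delegates to `getResponse`:

```java
// Hypothetical usage, e.g. from a Spring Shell command:
// print each chunk as soon as the model streams it back.
llmService.askQuestion("What is Spring AI?")
        .forEach(System.out::print);
```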

9 changes: 5 additions & 4 deletions workshop/exercise-2.md
@@ -36,22 +36,23 @@ private AssistantMessage appendToHistory(AssistantMessage assistantMessage) {
In the `getResponse` method, modify the return statement to append the response content to the conversation history, using the `stream().chatResponse()` and `appendToHistory` methods.

```java
return chatClient.prompt(prompt)
        .options(options)
        .stream()
        .chatResponse().toStream()
        .map(ChatResponse::getResults)
        .flatMap(List::stream)
        .map(Generation::getOutput)
        .map(this::appendToHistory)
        .map(AssistantMessage::getText);
```
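
The body of `appendToHistory` is collapsed in this diff. A minimal sketch of what such a helper could look like, assuming it simply records each streamed assistant chunk before passing it along:

```java
// Hypothetical body (the real one is collapsed above): store the chunk
// in the conversation history, then return it unchanged for further mapping.
private AssistantMessage appendToHistory(AssistantMessage assistantMessage) {
    history.add(assistantMessage);
    return assistantMessage;
}
```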

### Part 4 - Pass conversation history as context

In `getResponse`, add the `history` list content to the existing `messages` list (before the user message).

```java
List<Message> messages = new ArrayList<>();
messages.addAll(history);
messages.add(userMessage);
```
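
With `defaultSystem` set on the client, the effective order sent to the model is: system prompt, recorded history, then the new user message. A hypothetical two-turn exchange illustrating why that order matters:

```java
// First question: history is empty, so only the user message is added
// (the system prompt is applied by the client via defaultSystem).
llmService.askQuestion("Who created the Java language?").forEach(System.out::print);

// Follow-up: the prompt now also carries the chunks recorded by appendToHistory,
// letting the model resolve "it" from the previous answer.
llmService.askQuestion("When was it first released?").forEach(System.out::print);
```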
19 changes: 11 additions & 8 deletions workshop/exercise-4.md
@@ -103,8 +103,9 @@ And instantiate `promptTemplate` attribute.

```java
public RAGService(ChatClient.Builder builder, RAGDataService dataService, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    this.chatClient = builder
            .defaultSystem(promptSystem)
            .build();
    this.dataService = dataService;
    promptTemplate = new PromptTemplate("""
            Context:
@@ -133,19 +134,21 @@ Message message = promptTemplate.createMessage(Map.of("context", context, "question", question));
3. Call the LLM with this block of code and return the response.

```java
Prompt prompt = new Prompt(message);
OllamaOptions options = OllamaOptions.builder()
        .model("mistral:7b")
        .temperature(0.9)
        .build();

System.out.println("Preparing the answer...");

return chatClient.prompt(prompt).options(options)
        .stream()
        .chatResponse().toStream()
        .map(ChatResponse::getResults)
        .flatMap(List::stream)
        .map(Generation::getOutput)
        .map(AssistantMessage::getText);
```
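
A hypothetical caller for the finished service; the question string is illustrative, and it assumes the data source behind `RAGDataService` has already been loaded:

```java
// Hypothetical usage: stream the context-grounded answer to the console.
ragService.getResponse("What does the context say about Spring AI?")
        .forEach(System.out::print);
```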

## Solution
24 changes: 13 additions & 11 deletions workshop/solution/exercise-1/LLMService.java
@@ -23,30 +23,32 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
                .build();
    }

    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
24 changes: 13 additions & 11 deletions workshop/solution/exercise-2/LLMService.java
@@ -23,34 +23,36 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;
    private final List<Message> history;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
                .build();
        this.history = new ArrayList<>();
    }

    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.addAll(history);
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(this::appendToHistory)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
22 changes: 12 additions & 10 deletions workshop/solution/exercise-3/LLMService.java
@@ -23,18 +23,19 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;
    private final List<Message> history;
    private final DataService dataService;
    private final PromptTemplate userPromptTemplate;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, DataService dataService) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
                .build();
        this.history = new ArrayList<>();
        this.dataService = dataService;
        this.userPromptTemplate = new PromptTemplate("""
@@ -49,18 +50,19 @@ public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, DataService dataService) {
    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.addAll(history);
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(this::appendToHistory)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
20 changes: 11 additions & 9 deletions workshop/solution/exercise-4/RAGService.java
@@ -23,11 +23,11 @@ public class RAGService {
    private RAGDataService dataService;
    private ChatClient chatClient;
    private PromptTemplate promptTemplate;

    public RAGService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, RAGDataService dataService) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.dataService = dataService;
        promptTemplate = new PromptTemplate("""
                Answer the question based on this context:
@@ -44,19 +44,21 @@ public Stream<String> getResponse(final String question) {

        Message message = promptTemplate.createMessage(Map.of("context", context, "question", question));

        Prompt prompt = new Prompt(message);
        OllamaOptions options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.9)
                .build();

        System.out.println("Preparing the answer...");

        return chatClient.prompt(prompt).options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(AssistantMessage::getText);
    }

}