refactor(Client): Refactor client creation
clementgig committed Jan 15, 2025
1 parent 07bda76 commit 9a6fde8
Showing 8 changed files with 67 additions and 59 deletions.
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -3,6 +3,8 @@
## 2025-01-15

- `OllamaOptions` uses `.builder()` instead of `create()`
- Refactor client creation (with options and system prompt)
author: [email protected]

## 2025-01-07

29 changes: 14 additions & 15 deletions workshop/exercise-1.md
@@ -10,18 +10,16 @@ Modify the `LLMService` class.

We will use a `ChatClient` object to interact with the LLM. It can be built from the `ChatClient.Builder` already instantiated for us by Spring Boot's autoconfiguration.

Create a private final `ChatClient` attribute named `chatClient`.

In the `LLMService` constructor, set `chatClient` with the result of calling `.defaultSystem(promptSystem).build()` on the builder.

```java
private final ChatClient chatClient;

public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    this.chatClient = builder
            .defaultSystem(promptSystem)
            .build();
}
```

@@ -35,7 +33,7 @@ Please answer the question asked and provide the shortest possible response with

### Part 3 - Create query options object

Create an `OllamaOptions` attribute and initialize it in the constructor using the `OllamaOptions.builder()` method, building with model `mistral:7b` and temperature `0.8`.
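The `builder()` call follows the standard Java builder idiom. As a minimal illustration, here is a self-contained sketch of that idiom with a hypothetical `Options` record, not the real Spring AI `OllamaOptions` class:

```java
public class OptionsBuilderSketch {

    // Hypothetical immutable options type mirroring the shape of OllamaOptions.
    record Options(String model, double temperature) {
        static Builder builder() { return new Builder(); }

        static final class Builder {
            private String model;
            private double temperature;

            // Each setter returns the builder so calls can be chained.
            Builder model(String model) { this.model = model; return this; }
            Builder temperature(double t) { this.temperature = t; return this; }
            Options build() { return new Options(model, temperature); }
        }
    }

    public static void main(String[] args) {
        Options options = Options.builder()
                .model("mistral:7b")
                .temperature(0.8)
                .build();
        System.out.println(options.model() + " @ " + options.temperature()); // mistral:7b @ 0.8
    }
}
```

The builder replaces the deprecated `create()`-then-mutate style: all values are fixed when `build()` runs, so the resulting options object is immutable.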

```java
this.options = OllamaOptions.builder()
        .model("mistral:7b")
        .temperature(0.8)
        .build();
```

@@ -48,24 +46,25 @@

Complete the existing `getResponse` method with the following steps:

1. Create a new `Prompt` object using the `Prompt(List<Message> messages)` constructor, passing the previously created message list as argument.
2. Call the `chatClient.prompt` method with the `Prompt` object, then chain the `options(options)` and `stream()` calls.
3. Map and return the streamed result.

```java
private Stream<String> getResponse(final Message userMessage) {

    List<Message> messages = new ArrayList<>();
    messages.add(userMessage);

    Prompt prompt = new Prompt(messages);
    return chatClient.prompt(prompt)
            .options(options)
            .stream()
            .chatResponse().toStream()
            .map(ChatResponse::getResults)
            .flatMap(List::stream)
            .map(Generation::getOutput)
            .map(AssistantMessage::getText);
}
```
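The chain of `map`/`flatMap` calls above is ordinary `java.util.stream` plumbing. As a self-contained sketch with hypothetical stand-in records (`Msg`, `Generation`, and `ChatResponse` here are not the Spring AI types), the flatten-and-map steps behave like this:

```java
import java.util.List;
import java.util.stream.Stream;

public class StreamMappingSketch {

    // Hypothetical stand-ins for Spring AI's ChatResponse, Generation and
    // AssistantMessage types; only the shape of the mapping chain matters here.
    record Msg(String text) {}
    record Generation(Msg output) {}
    record ChatResponse(List<Generation> results) {}

    // Mirrors the exercise's chain: responses -> results -> outputs -> text chunks.
    static String joinChunks(Stream<ChatResponse> responses) {
        return responses
                .map(ChatResponse::results)
                .flatMap(List::stream)
                .map(Generation::output)
                .map(Msg::text)
                .reduce("", String::concat);
    }

    public static void main(String[] args) {
        // Two streamed chunks, each carrying one generation.
        Stream<ChatResponse> streamed = Stream.of(
                new ChatResponse(List.of(new Generation(new Msg("Hello")))),
                new ChatResponse(List.of(new Generation(new Msg(" world")))));
        System.out.println(joinChunks(streamed)); // prints "Hello world"
    }
}
```

Each streamed `ChatResponse` may carry several generations, which is why `flatMap(List::stream)` is needed before mapping each generation down to its text.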

9 changes: 5 additions & 4 deletions workshop/exercise-2.md
@@ -36,22 +36,23 @@ private AssistantMessage appendToHistory(AssistantMessage assistantMessage) {
In the `getResponse` method, modify the return statement to append the response content to the conversation history, using the `stream().chatResponse()` and `appendToHistory` methods.

```java
return chatClient.prompt(prompt)
        .options(options)
        .stream()
        .chatResponse().toStream()
        .map(ChatResponse::getResults)
        .flatMap(List::stream)
        .map(Generation::getOutput)
        .map(this::appendToHistory)
        .map(AssistantMessage::getText);
```

### Part 4 - Pass conversation history as context

In `getResponse`, add the `history` list content to the existing `messages` list (before the user message).

```java
List<Message> messages = new ArrayList<>();
messages.addAll(history);
messages.add(userMessage);
```
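The assembly order matters: the system prompt is now supplied via `defaultSystem`, so the list holds only the history followed by the new user message. A minimal stdlib sketch of that ordering, using a hypothetical `Msg` record rather than Spring AI's `Message`:

```java
import java.util.ArrayList;
import java.util.List;

public class HistoryOrderSketch {

    // Hypothetical message type; Spring AI's Message interface is not needed
    // to illustrate the list ordering.
    record Msg(String role, String text) {}

    // Builds the message list the way the exercise does: history first, user last.
    static List<Msg> assemble(List<Msg> history, Msg userMessage) {
        List<Msg> messages = new ArrayList<>(history);
        messages.add(userMessage);
        return messages;
    }

    public static void main(String[] args) {
        List<Msg> history = List.of(
                new Msg("user", "Hi"),
                new Msg("assistant", "Hello!"));
        List<Msg> messages = assemble(history, new Msg("user", "How are you?"));
        System.out.println(messages.size());        // 3
        System.out.println(messages.get(2).text()); // newest user message is last
    }
}
```

Keeping the newest user message last preserves the chronological order the model expects when it reads the conversation.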
20 changes: 11 additions & 9 deletions workshop/exercise-4.md
@@ -103,8 +103,9 @@ And instantiate the `promptTemplate` attribute.

```java
public RAGService(ChatClient.Builder builder, RAGDataService dataService, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    this.chatClient = builder
            .defaultSystem(promptSystem)
            .build();
    this.dataService = dataService;
    promptTemplate = new PromptTemplate("""
            Context:
```

@@ -133,20 +134,21 @@ Message message = promptTemplate.createMessage(Map.of("context", context, "question", question));
3. Call the LLM with this block of code and return the response.

```java
Prompt prompt = new Prompt(message);
OllamaOptions options = OllamaOptions.builder()
        .model("mistral:7b")
        .temperature(0.9)
        .build();

System.out.println("Preparing the answer...");

return chatClient.prompt(prompt).options(options)
        .stream()
        .chatResponse().toStream()
        .map(ChatResponse::getResults)
        .flatMap(List::stream)
        .map(Generation::getOutput)
        .map(AssistantMessage::getText);
```

## Solution
17 changes: 9 additions & 8 deletions workshop/solution/exercise-1/LLMService.java
@@ -23,12 +23,12 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
@@ -38,16 +38,17 @@ public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
17 changes: 9 additions & 8 deletions workshop/solution/exercise-2/LLMService.java
@@ -23,13 +23,13 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;
    private final List<Message> history;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
@@ -40,18 +40,19 @@ public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem) {
    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.addAll(history);
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(this::appendToHistory)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
15 changes: 8 additions & 7 deletions workshop/solution/exercise-3/LLMService.java
@@ -23,15 +23,15 @@
public class LLMService {

    private final ChatClient chatClient;
    private final OllamaOptions options;
    private final List<Message> history;
    private final DataService dataService;
    private final PromptTemplate userPromptTemplate;

    public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, DataService dataService) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.8)
@@ -50,18 +50,19 @@ public LLMService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, DataService dataService) {
    private Stream<String> getResponse(final Message userMessage) {

        List<Message> messages = new ArrayList<>();
        messages.addAll(history);
        messages.add(userMessage);

        Prompt prompt = new Prompt(messages);
        return chatClient.prompt(prompt)
                .options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(this::appendToHistory)
                .map(AssistantMessage::getText);
    }

    public Stream<String> askQuestion(final String question) {
17 changes: 9 additions & 8 deletions workshop/solution/exercise-4/RAGService.java
@@ -23,11 +23,11 @@ public class RAGService {
    private RAGDataService dataService;
    private ChatClient chatClient;
    private PromptTemplate promptTemplate;

    public RAGService(ChatClient.Builder builder, @Value("classpath:/prompt-system.md") Resource promptSystem, RAGDataService dataService) {
        this.chatClient = builder
                .defaultSystem(promptSystem)
                .build();
        this.dataService = dataService;
        promptTemplate = new PromptTemplate("""
                Answer the question based on this context:
@@ -44,20 +44,21 @@ public Stream<String> getResponse(final String question) {

        Message message = promptTemplate.createMessage(Map.of("context", context, "question", question));

        Prompt prompt = new Prompt(message);
        OllamaOptions options = OllamaOptions.builder()
                .model("mistral:7b")
                .temperature(0.9)
                .build();

        System.out.println("Preparing the answer...");

        return chatClient.prompt(prompt).options(options)
                .stream()
                .chatResponse().toStream()
                .map(ChatResponse::getResults)
                .flatMap(List::stream)
                .map(Generation::getOutput)
                .map(AssistantMessage::getText);
    }

}
