Commit 83445f8

Merge branch 'main' into main

psilberk authored Jan 8, 2025
2 parents 9623af1 + 205e53a
Showing 30 changed files with 1,465 additions and 939 deletions.
43 changes: 43 additions & 0 deletions docs/docs/integrations/embedding-models/xinference.md
@@ -0,0 +1,43 @@
---
sidebar_position: 20
---

# Xinference

- https://inference.readthedocs.io/


## Maven Dependency

`0.37.0` and later:

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-xinference</artifactId>
    <version>0.37.0</version>
</dependency>
```

Or, you can use the BOM to manage dependency versions consistently:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>0.37.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

## APIs

- `XinferenceEmbeddingModel`

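Below is a minimal usage sketch. The builder options (`baseUrl`, `modelName`) follow the common LangChain4j builder pattern but are assumptions here; see the integration test linked under Examples for the exact parameters.

```java
import dev.langchain4j.community.model.xinference.XinferenceEmbeddingModel;
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.model.output.Response;

public class XinferenceEmbeddingExample {

    public static void main(String[] args) {
        // Assumed builder options: point the model at a locally running Xinference server
        EmbeddingModel model = XinferenceEmbeddingModel.builder()
                .baseUrl("http://localhost:9997") // hypothetical local endpoint
                .modelName("bge-m3")              // hypothetical embedding model name
                .build();

        Response<Embedding> response = model.embed("Hello, Xinference!");
        System.out.println(response.content().dimension());
    }
}
```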

## Examples

- [XinferenceEmbeddingModelIT](https://github.com/langchain4j/langchain4j-community/blob/main/models/langchain4j-community-xinference/src/test/java/dev/langchain4j/community/model/xinference/XinferenceEmbeddingModelIT.java)
43 changes: 43 additions & 0 deletions docs/docs/integrations/image-models/xinference.md
@@ -0,0 +1,43 @@
---
sidebar_position: 6
---

# Xinference

- https://inference.readthedocs.io/


## Maven Dependency

`0.37.0` and later:

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-xinference</artifactId>
    <version>0.37.0</version>
</dependency>
```

Or, you can use the BOM to manage dependency versions consistently:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>0.37.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

## APIs

- `XinferenceImageModel`

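A minimal usage sketch is shown below. The builder options and the model name are assumptions; consult the integration test linked under Examples for the actual parameters.

```java
import dev.langchain4j.community.model.xinference.XinferenceImageModel;
import dev.langchain4j.data.image.Image;
import dev.langchain4j.model.image.ImageModel;
import dev.langchain4j.model.output.Response;

public class XinferenceImageExample {

    public static void main(String[] args) {
        // Assumed builder options; the actual parameter names may differ
        ImageModel model = XinferenceImageModel.builder()
                .baseUrl("http://localhost:9997") // hypothetical local endpoint
                .modelName("sd3-medium")          // hypothetical image model name
                .build();

        Response<Image> response = model.generate("A watercolor painting of a lighthouse at dawn");
        System.out.println(response.content().url());
    }
}
```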

## Examples

- [XinferenceImageModelIT](https://github.com/langchain4j/langchain4j-community/blob/main/models/langchain4j-community-xinference/src/test/java/dev/langchain4j/community/model/xinference/XinferenceImageModelIT.java)
46 changes: 46 additions & 0 deletions docs/docs/integrations/language-models/xinference.md
@@ -0,0 +1,46 @@
---
sidebar_position: 19
---

# Xinference

- https://inference.readthedocs.io/


## Maven Dependency

`0.37.0` and later:

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-xinference</artifactId>
    <version>0.37.0</version>
</dependency>
```

Or, you can use the BOM to manage dependency versions consistently:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>0.37.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```


## APIs

- `XinferenceChatModel`
- `XinferenceStreamingChatModel`

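A minimal sketch of both the blocking and the streaming variants follows. The builder options and model name are assumptions; the integration tests linked under Examples show the real configuration surface.

```java
import dev.langchain4j.community.model.xinference.XinferenceChatModel;
import dev.langchain4j.community.model.xinference.XinferenceStreamingChatModel;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.chat.StreamingChatLanguageModel;
import dev.langchain4j.model.output.Response;

public class XinferenceChatExample {

    public static void main(String[] args) {
        // Blocking chat model (builder options are assumptions)
        ChatLanguageModel chatModel = XinferenceChatModel.builder()
                .baseUrl("http://localhost:9997") // hypothetical local endpoint
                .modelName("qwen2.5-instruct")    // hypothetical chat model name
                .build();
        System.out.println(chatModel.generate("Tell me a joke about llamas"));

        // Streaming chat model: tokens arrive through the handler callbacks
        StreamingChatLanguageModel streamingModel = XinferenceStreamingChatModel.builder()
                .baseUrl("http://localhost:9997")
                .modelName("qwen2.5-instruct")
                .build();
        streamingModel.generate("Tell me a joke about llamas", new StreamingResponseHandler<AiMessage>() {

            @Override
            public void onNext(String token) {
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<AiMessage> response) {
                System.out.println();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}
```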

## Examples

- [XinferenceChatModelIT](https://github.com/langchain4j/langchain4j-community/blob/main/models/langchain4j-community-xinference/src/test/java/dev/langchain4j/community/model/xinference/XinferenceChatModelIT.java)
- [XinferenceStreamingChatModelIT](https://github.com/langchain4j/langchain4j-community/blob/main/models/langchain4j-community-xinference/src/test/java/dev/langchain4j/community/model/xinference/XinferenceStreamingChatModelIT.java)
43 changes: 43 additions & 0 deletions docs/docs/integrations/scoring-reranking-models/xinference.md
@@ -0,0 +1,43 @@
---
sidebar_position: 6
---

# Xinference

- https://inference.readthedocs.io/


## Maven Dependency

`0.37.0` and later:

```xml
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-community-xinference</artifactId>
    <version>0.37.0</version>
</dependency>
```

Or, you can use the BOM to manage dependency versions consistently:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-community-bom</artifactId>
            <version>0.37.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

## APIs

- `XinferenceScoringModel`

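A minimal scoring (reranking) sketch is shown below. The builder options and the reranker model name are assumptions; the integration test linked under Examples documents the real ones.

```java
import dev.langchain4j.community.model.xinference.XinferenceScoringModel;
import dev.langchain4j.model.output.Response;
import dev.langchain4j.model.scoring.ScoringModel;

public class XinferenceScoringExample {

    public static void main(String[] args) {
        // Assumed builder options; check the integration test for the real ones
        ScoringModel model = XinferenceScoringModel.builder()
                .baseUrl("http://localhost:9997") // hypothetical local endpoint
                .modelName("bge-reranker-v2-m3")  // hypothetical reranker model name
                .build();

        Response<Double> score = model.score(
                "Labrador Retrievers are friendly, outgoing dogs.",
                "Tell me about dog breeds");
        System.out.println(score.content());
    }
}
```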

## Examples

- [XinferenceScoringModelIT](https://github.com/langchain4j/langchain4j-community/blob/main/models/langchain4j-community-xinference/src/test/java/dev/langchain4j/community/model/xinference/XinferenceScoringModelIT.java)
2 changes: 1 addition & 1 deletion docs/docs/tutorials/2-chat-memory.md
@@ -46,7 +46,7 @@ Currently, LangChain4j offers 2 out-of-the-box implementations:
which also operates as a sliding window but focuses on keeping the `N` most recent **tokens**,
evicting older messages as needed.
Messages are indivisible. If a message doesn't fit, it is evicted completely.
- `MessageWindowChatMemory` requires a `Tokenizer` to count the tokens in each `ChatMessage`.
+ `TokenWindowChatMemory` requires a `Tokenizer` to count the tokens in each `ChatMessage`.
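
As a hedged illustration of the requirement above (the `OpenAiTokenizer` is just one possible `Tokenizer` implementation and needs the `langchain4j-open-ai` module on the classpath):

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.TokenWindowChatMemory;
import dev.langchain4j.model.openai.OpenAiTokenizer;

public class TokenWindowChatMemoryExample {

    public static void main(String[] args) {
        // Keep at most 1000 tokens; whole messages are evicted once the budget is exceeded
        ChatMemory chatMemory = TokenWindowChatMemory.withMaxTokens(1000, new OpenAiTokenizer("gpt-4o-mini"));

        chatMemory.add(UserMessage.from("Hello!"));
        System.out.println(chatMemory.messages());
    }
}
```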

## Persistence

5 changes: 5 additions & 0 deletions docs/docs/tutorials/5-ai-services.md
@@ -780,5 +780,10 @@ I can evaluate both of them separately and find the most optimal parameters for
or, in the long run, even fine-tune a small specialized model for each specific subtask.


## Testing

- [An example of integration testing for a Customer Support Agent](https://github.com/langchain4j/langchain4j-examples/blob/main/customer-support-agent-example/src/test/java/dev/langchain4j/example/CustomerSupportAgentIT.java)


## Related Tutorials
- [LangChain4j AiServices Tutorial](https://www.sivalabs.in/langchain4j-ai-services-tutorial/) by [Siva](https://www.sivalabs.in/)
4 changes: 4 additions & 0 deletions docs/docs/tutorials/spring-boot-integration.md
@@ -224,6 +224,10 @@ interface Assistant {
For this, please import the `langchain4j-reactor` module.
See more details [here](/tutorials/ai-services#flux).
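
As a brief sketch (assuming the standard AI Services pattern described in the linked section), a streaming assistant can simply declare `Flux<String>` as its return type:

```java
import dev.langchain4j.service.spring.AiService;
import reactor.core.publisher.Flux;

@AiService
interface Assistant {

    // With langchain4j-reactor on the classpath, tokens can be streamed as a Flux
    Flux<String> chat(String userMessage);
}
```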

## Testing

- [An example of integration testing for a Customer Support Agent](https://github.com/langchain4j/langchain4j-examples/blob/main/customer-support-agent-example/src/test/java/dev/langchain4j/example/CustomerSupportAgentIT.java)

## Supported versions

LangChain4j Spring Boot integration requires Java 17 and Spring Boot 3.2.
26 changes: 26 additions & 0 deletions docs/docs/tutorials/testing-and-evaluation.md
@@ -0,0 +1,26 @@
---
sidebar_position: 33
---

# Testing and Evaluation

## Examples

[Here](https://github.com/langchain4j/langchain4j-examples/blob/main/customer-support-agent-example/src/test/java/dev/langchain4j/example/CustomerSupportAgentIT.java)
is an example of integration testing for a Customer Support Agent.
This corresponds to [Level 1: Unit Tests](https://hamel.dev/blog/posts/evals/#level-1-unit-tests).

## Recommended Reading

- [Your AI Product Needs Evals](https://hamel.dev/blog/posts/evals/)
- [Creating a LLM-as-a-Judge That Drives Business Results](https://hamel.dev/blog/posts/llm-judge/)
- [A Practical Guide to RAG Pipeline Evaluation (Part 1: Retrieval)](https://medium.com/relari/a-practical-guide-to-rag-pipeline-evaluation-part-1-27a472b09893)
- [A Practical Guide to RAG Pipeline Evaluation (Part 2: Generation)](https://medium.com/relari/a-practical-guide-to-rag-evaluation-part-2-generation-c79b1bde0f5d)
- [How important is a Golden Dataset for LLM evaluation?](https://medium.com/relari/how-important-is-a-golden-dataset-for-llm-pipeline-evaluation-4ef6deb14dc5)
- [Case Study: Reference-free vs Reference-based evaluation of RAG pipeline](https://medium.com/relari/case-study-reference-free-vs-reference-based-evaluation-of-rag-pipeline-9a49ef49866c)
- [How to evaluate complex GenAI Apps: a granular approach](https://medium.com/relari/how-to-evaluate-complex-genai-apps-a-granular-approach-0ab929d5b3e2)
- [Generate Synthetic Data to Test LLM Applications](https://medium.com/relari/generate-synthetic-data-to-test-llm-applications-4bffeb51b80e)

:::note
More information coming soon.
:::