Version 1.1.1
peterbanda committed on Dec 2, 2024 (commit 4a49256, 1 parent: 360fde7)
Showing 4 changed files with 10 additions and 10 deletions.
**README.md** — 8 changes: 4 additions & 4 deletions

````diff
@@ -1,5 +1,5 @@
 # OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-1.1.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
+[![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
 
 This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **vision**, and **voice routines** (as defined [here](https://beta.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
 
@@ -63,7 +63,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To install the library, add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "1.1.0"
+"io.cequence" %% "openai-scala-client" % "1.1.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -72,11 +72,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>1.1.0</version>
+    <version>1.1.1</version>
 </dependency>
 ```
 
-If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.0"` instead.
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.1"` instead.
 
 ## Config ⚙️
````
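The two sbt snippets above can be combined into one minimal *build.sbt* that pins the released version. This is a sketch, not part of the commit: the Scala version shown is an assumption (per the README, 2.12, 2.13, and 3 are all supported), and the streaming module line is optional.

```
// Minimal build.sbt sketch (assumed Scala version; any supported one works)
ThisBuild / scalaVersion := "2.13.12"

libraryDependencies ++= Seq(
  "io.cequence" %% "openai-scala-client" % "1.1.1",
  // optional: only needed for streaming support
  "io.cequence" %% "openai-scala-client-stream" % "1.1.1"
)
```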
**build.sbt** — 2 changes: 1 addition & 1 deletion

````diff
@@ -7,7 +7,7 @@ val scala3 = "3.2.2"
 
 ThisBuild / organization := "io.cequence"
 ThisBuild / scalaVersion := scala212
-ThisBuild / version := "1.1.1.RC.17"
+ThisBuild / version := "1.1.1"
 ThisBuild / isSnapshot := false
 
 lazy val commonSettings = Seq(
````
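The sbt coordinates above use `%%`, while the Maven coordinates in the READMEs spell out artifacts like `openai-scala-client_2.12`. The connection is that sbt's `%%` appends the Scala binary version to the module name. A self-contained sketch of that naming rule in plain Scala (no sbt required):

```scala
object CrossVersionDemo {
  // sbt's %% operator resolves "module" to the Maven artifact
  // "module_<scalaBinaryVersion>", e.g. openai-scala-client_2.12.
  def crossArtifact(module: String, scalaBinaryVersion: String): String =
    s"${module}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit =
    Seq("2.12", "2.13", "3").foreach { v =>
      println(crossArtifact("openai-scala-client", v))
    }
}
```

This is why a *pom.xml* user must pick the suffix matching their Scala version, while an sbt user does not.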
**openai-core/README.md** — 6 changes: 3 additions & 3 deletions

````diff
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-1.0.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Core [![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This is the core module, which contains mostly domain classes and the [OpenAIService](./src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala) definition.
 Note that the full project documentation can be found [here](../README.md).
@@ -10,7 +10,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-core" % "1.0.0"
+"io.cequence" %% "openai-scala-core" % "1.1.1"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -19,6 +19,6 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-core_2.12</artifactId>
-    <version>1.0.0</version>
+    <version>1.1.1</version>
 </dependency>
 ```
````
**openai-count-tokens/README.md** — 4 changes: 2 additions & 2 deletions

````diff
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.0.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.1-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides ability for estimating the number of tokens an OpenAI chat completion request will use.
 Note that the full project documentation can be found [here](../README.md).
@@ -21,7 +21,7 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-count-tokens_2.12</artifactId>
-    <version>1.0.0</version>
+    <version>1.1.1</version>
 </dependency>
 ```
````
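For context on what the count-tokens module estimates: a chat completion request consumes tokens, and a common back-of-the-envelope rule for English text is roughly four characters per token. The sketch below implements only that naive heuristic — it is NOT the module's actual tokenizer-based counting, just an illustration of the concept.

```scala
object TokenEstimate {
  // Naive heuristic: ~4 characters per token for English text.
  // NOT the algorithm in openai-scala-count-tokens; ballpark only.
  def estimateTokens(text: String): Int =
    math.ceil(text.length / 4.0).toInt

  def main(args: Array[String]): Unit = {
    val prompt = "You are a helpful assistant."
    println(s"~${estimateTokens(prompt)} tokens (rough estimate)")
  }
}
```

For real usage, prefer the module's own counting, which accounts for model-specific tokenization and message framing overhead.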
