Commit ba5b4c6

Version 1.2.0 🔥
1 parent 61dd9be commit ba5b4c6

File tree: 3 files changed (+8 −8 lines)


README.md

4 additions, 4 deletions

````diff
@@ -1,5 +1,5 @@
 # OpenAI Scala Client 🤖
-[![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
+[![version](https://img.shields.io/badge/version-1.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT) ![GitHub Stars](https://img.shields.io/github/stars/cequence-io/openai-scala-client?style=social) [![Twitter Follow](https://img.shields.io/twitter/follow/0xbnd?style=social)](https://twitter.com/0xbnd) ![GitHub CI](https://github.com/cequence-io/openai-scala-client/actions/workflows/continuous-integration.yml/badge.svg)
 
 This is a no-nonsense async Scala client for OpenAI API supporting all the available endpoints and params **including streaming**, the newest **chat completion**, **responses API**, **assistants API**, **tools**, **vision**, and **voice routines** (as defined [here](https://platform.openai.com/docs/api-reference)), provided in a single, convenient service called [OpenAIService](./openai-core/src/main/scala/io/cequence/openaiscala/service/OpenAIService.scala). The supported calls are:
 
@@ -72,7 +72,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To install the library, add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-client" % "1.1.2"
+"io.cequence" %% "openai-scala-client" % "1.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -81,11 +81,11 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-client_2.12</artifactId>
-    <version>1.1.2</version>
+    <version>1.2.0</version>
 </dependency>
 ```
 
-If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.1.2"` instead.
+If you want streaming support, use `"io.cequence" %% "openai-scala-client-stream" % "1.2.0"` instead.
 
 ## Config ⚙️
 
````
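Both coordinate styles above name the same artifact: sbt's `%%` operator appends the Scala binary version to the module name, which is why the Maven `artifactId` spells out `openai-scala-client_2.12` explicitly. A minimal sketch of that naming rule (the helper below is hypothetical, for illustration only, not part of sbt):

```scala
// Hypothetical helper mirroring sbt's "%%" cross-version naming convention:
// "%%" resolves "module" to "module_<scalaBinaryVersion>" at dependency resolution time.
object CrossVersion {
  def crossArtifactName(module: String, scalaBinaryVersion: String): String =
    s"${module}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit =
    println(crossArtifactName("openai-scala-client", "2.12")) // openai-scala-client_2.12
}
```

This is also why Maven users on Scala 2.13 or 3 would pick the matching `_2.13` or `_3` suffix instead.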
google-gemini-client/src/main/scala/io/cequence/openaiscala/gemini/service/GeminiServiceFactory.scala

Lines changed: 1 addition & 1 deletion
````diff
@@ -32,7 +32,7 @@ object GeminiServiceFactory extends GeminiServiceConsts with EnvHelper {
   * Create a new instance of the [[OpenAIChatCompletionService]] wrapping the SonarService
   *
   * @param apiKey
-  *   The API key to use for authentication (if not specified the SONAR_API_KEY env. variable
+  *   The API key to use for authentication (if not specified the GOOGLE_API_KEY env. variable
   *   will be used)
   * @param timeouts
   *   The explicit timeouts to use for the service (optional)
````
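The corrected scaladoc describes an explicit-key-then-environment fallback. A minimal sketch of that pattern (the helper below is hypothetical and only illustrates the documented behavior; the library's actual factory code may differ):

```scala
// Hypothetical sketch of the documented API-key fallback:
// use the explicitly passed key if present, else read the GOOGLE_API_KEY env. variable.
object ApiKeyResolution {
  def resolveApiKey(explicitKey: Option[String], envVar: String = "GOOGLE_API_KEY"): String =
    explicitKey
      .orElse(sys.env.get(envVar))
      .getOrElse(throw new IllegalStateException(s"No API key passed and $envVar is not set"))

  def main(args: Array[String]): Unit =
    println(resolveApiKey(Some("demo-key"))) // demo-key
}
```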

openai-count-tokens/README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -1,4 +1,4 @@
-# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.1.2-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
+# OpenAI Scala Client - Count tokens [![version](https://img.shields.io/badge/version-1.2.0-green.svg)](https://cequence.io) [![License](https://img.shields.io/badge/License-MIT-lightgrey.svg)](https://opensource.org/licenses/MIT)
 
 This module provides ability for estimating the number of tokens an OpenAI chat completion request will use.
 Note that the full project documentation can be found [here](../README.md).
@@ -12,7 +12,7 @@ The currently supported Scala versions are **2.12, 2.13**, and **3**.
 To pull the library you have to add the following dependency to your *build.sbt*
 
 ```
-"io.cequence" %% "openai-scala-count-tokens" % "1.1.0"
+"io.cequence" %% "openai-scala-count-tokens" % "1.2.0"
 ```
 
 or to *pom.xml* (if you use maven)
@@ -21,7 +21,7 @@ or to *pom.xml* (if you use maven)
 <dependency>
     <groupId>io.cequence</groupId>
     <artifactId>openai-scala-count-tokens_2.12</artifactId>
-    <version>1.1.2</version>
+    <version>1.2.0</version>
 </dependency>
 ```
 
````
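The count-tokens module estimates token usage with the model's actual tokenizer. For intuition only, here is a deliberately naive character-based approximation (this is NOT the module's algorithm; the ~4-characters-per-token rule of thumb for English text is an assumption):

```scala
// Naive token estimate for intuition only; real counting, as in
// openai-scala-count-tokens, tokenizes the text with the model's tokenizer.
object NaiveTokenEstimate {
  // Rough rule of thumb: roughly 4 characters per token for English text.
  def estimateTokens(text: String): Int =
    math.ceil(text.length / 4.0).toInt

  def main(args: Array[String]): Unit =
    println(estimateTokens("hello world")) // 3
}
```

A real estimate also has to account for per-message overhead in chat completion requests, which is exactly what the module handles.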