This repository was archived by the owner on Jun 6, 2024. It is now read-only.

Added "function" capabilities #298

Merged
merged 24 commits into TheoKanning:main on Jun 22, 2023

Conversation

Eduardo-T
Contributor

This fork adds the new function capabilities. Check it out.

@AESIRZZW

@TheoKanning Please review this change when you have time. Thanks in advance.


@Andrewcpu left a comment


Not a full review, but the unnecessary changes stemming from the local Maven repo tutorial should be removed.
Edit: and that weird dependency is sus to me.

api/build.gradle Outdated
@@ -2,7 +2,7 @@ apply plugin: 'java-library'
 apply plugin: "com.vanniktech.maven.publish"

 dependencies {
-implementation 'com.fasterxml.jackson.core:jackson-annotations:2.9.0'
+api 'com.fasterxml.jackson.core:jackson-annotations:2.9.0'


Why does this have to change?

Contributor Author

It is also being used by the 'service' module, and the api configuration (unlike implementation) exposes the dependency to the modules that depend on this one.

@@ -5,7 +5,8 @@ dependencies {
 api project(":client")
 api 'com.squareup.retrofit2:retrofit:2.9.0'
 implementation 'com.squareup.retrofit2:adapter-rxjava2:2.9.0'
-implementation 'com.squareup.retrofit2:converter-jackson:2.9.0'
+implementation 'com.squareup.retrofit2:converter-jackson:2.7.2'
+implementation 'com.kjetland:mbknor-jackson-jsonschema_2.13:1.0.39'


?

@Andrewcpu commented Jun 15, 2023

This build of com.kjetland:mbknor-jackson-jsonschema_2.13:1.0.39 is very vulnerable and looks sus to me. Why would you downgrade the Jackson converter to 2.7.2?
https://mvnrepository.com/artifact/com.kjetland/mbknor-jackson-jsonschema_2.13/1.0.39

^ This package is unmaintained and vulnerable.

@Eduardo-T
Contributor Author

Thanks @Andrewcpu,

The dependency downgrade was a mistake; I've fixed it. And mbknor-jackson-jsonschema_2.13:1.0.39 was changed to mbknor-jackson-jsonschema_2.12:1.0.39, which fixes its vulnerabilities. It is needed for the API to work (as seen here). But if we're going to talk about vulnerabilities, com.squareup.retrofit2:converter-jackson:2.9.0 already has some.
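
(For context on why that dependency is there at all: mbknor-jackson-jsonschema is what turns a plain Java parameter class into the JSON Schema sent in the request's functions field. Below is a minimal sketch of that step using the library's JsonSchemaGenerator entry point; WeatherRequest is only a hypothetical parameter class, the same example type used later in this thread.)

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kjetland.jackson.jsonSchema.JsonSchemaGenerator;

class FunctionSchemaSketch {
    // Builds the JSON Schema describing the parameter class's fields; this is what ends up
    // as the function's "parameters" object in the chat completion request.
    static JsonNode parametersSchemaFor(Class<?> parameterClass) {
        ObjectMapper mapper = new ObjectMapper();
        return new JsonSchemaGenerator(mapper).generateJsonSchema(parameterClass);
    }
}

Calling parametersSchemaFor(WeatherRequest.class) would yield the schema the model sees for that function.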

Feel free to update anything you need when doing the pull. I'm not a GitHub expert, but from what I know you can decide what to include and what to leave out. I changed the .md and added the sections you commented on with '?', so that people looking at my fork could install it on their own machine and use it right away. You can keep only the necessary changes in the code.

If that's not possible, let me know and I'll help, once again, with the necessary changes.

@Eduardo-T
Contributor Author

Hello again @Andrewcpu,

After doing some research, I found that the changes could indeed be made, but it was tedious, so I've already made them myself.

I've also added information to the README on how to use the new feature. Let me know what you think.

@Grogdunn
Contributor

@Eduardo-T check this out: optionfactory@024ecb9

Per the documentation, function_call (at the top level) should be either a string ("auto" or "none") or an object containing a string 😮‍💨:

Controls how the model responds to function calls. "none" means the model does not call a function, and responds to the end-user. "auto" means the model can pick between an end-user or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function. "none" is the default when no functions are present. "auto" is the default if functions are present.

@Eduardo-T
Contributor Author

> @Eduardo-T check this out: optionfactory@024ecb9
>
> Per the documentation, function_call (at the top level) should be either a string ("auto" or "none") or an object containing a string 😮‍💨:
>
> Controls how the model responds to function calls. "none" means the model does not call a function, and responds to the end-user. "auto" means the model can pick between an end-user or calling a function. Specifying a particular function via {"name": "my_function"} forces the model to call that function. "none" is the default when no functions are present. "auto" is the default if functions are present.

Good eye! I have merged the changes. Thanks for noticing and fixing it.

@Grogdunn
Contributor

Thx @Eduardo-T ported your changes to my fork 👍

…ect mapper which is required when using the library in Kotlin
@lumiseven

@Eduardo-T
Thank you for your PR, I tried it and it works very well.
But if I use stream mode (stream=true), the value of choices.0.delta.function_call.arguments comes back in plain-text format, so would it be better to use StringSerializer and StringDeserializer for the getArguments method in ChatFunctionCallMixIn?
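
(For readers unfamiliar with Jackson mix-ins, the change suggested above would look roughly like this sketch. It assumes the arguments are handled as a plain String end to end, which is not how the merged code ended up working; see the reply further down, where the streamed chunks are accumulated into a JsonNode instead.)

import com.fasterxml.jackson.databind.annotation.JsonDeserialize;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import com.fasterxml.jackson.databind.deser.std.StringDeserializer;
import com.fasterxml.jackson.databind.ser.std.StringSerializer;

// Hypothetical mix-in: treat the streamed function-call arguments as raw text.
abstract class ChatFunctionCallMixIn {
    @JsonSerialize(using = StringSerializer.class)
    abstract String getArguments();

    @JsonDeserialize(using = StringDeserializer.class)
    abstract void setArguments(String arguments);
}

// It would be registered on the service's ObjectMapper with something like:
// objectMapper.addMixIn(ChatFunctionCall.class, ChatFunctionCallMixIn.class);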

@davidb-e4s
Contributor

Just opened a PR to allow your changes to work with Kotlin @Eduardo-T

@TheoKanning
Owner

@Eduardo-T Thanks for doing this! I merged another PR that created some small conflicts. Also, can you add a basic test for this? Thanks again!

@Eduardo-T
Contributor Author

> @Eduardo-T
> Thank you for your PR, I tried it and it works very well.
> But if I use stream mode (stream=true), the value of choices.0.delta.function_call.arguments comes back in plain-text format, so would it be better to use StringSerializer and StringDeserializer for the getArguments method in ChatFunctionCallMixIn?

Thanks! Yes, I am currently working on an update that allows stream completions. I will be sharing it soon.

Allowing ObjectMapper to be overridden (for kotlin to work)
@Eduardo-T
Contributor Author

Eduardo-T commented Jun 19, 2023

> @Eduardo-T
> Thank you for your PR, I tried it and it works very well.
> But if I use stream mode (stream=true), the value of choices.0.delta.function_call.arguments comes back in plain-text format, so would it be better to use StringSerializer and StringDeserializer for the getArguments method in ChatFunctionCallMixIn?

I implemented the solution. You can refer to "OpenAiApiFunctionsWithStreamExample.java" for an example.

Basically, for practical purposes, the stream can be mapped to a ChatMessageAccumulator, which appends the function and message chunks, bringing everything together without you having to deal with it manually.

The changes also take care of the "plain text format" issue you mentioned: each chunk is converted to a TextNode, and when the stream finishes, the accumulated version is converted to a JsonNode (most likely an ObjectNode). In general, you shouldn't even need to care about this unless you're creating your own executor or something similar.
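
(A rough sketch of that streaming flow follows; it is not a verbatim copy of the example file, and the service and accumulator method names used below, streamChatCompletion, mapStreamToAccumulator, and getAccumulatedMessage, are assumptions that may differ slightly in the merged code.)

import com.theokanning.openai.completion.chat.ChatCompletionChunk;
import com.theokanning.openai.completion.chat.ChatCompletionRequest;
import com.theokanning.openai.completion.chat.ChatFunctionCall;
import com.theokanning.openai.completion.chat.ChatMessage;
import com.theokanning.openai.service.OpenAiService;
import io.reactivex.Flowable;

class StreamingFunctionSketch {
    // Consumes the chunk stream through the accumulator and returns the assembled function call.
    static ChatFunctionCall functionCallFromStream(OpenAiService service, ChatCompletionRequest request) {
        Flowable<ChatCompletionChunk> chunks = service.streamChatCompletion(request);
        ChatMessage assembled = service.mapStreamToAccumulator(chunks)
                .lastElement()        // the final accumulator holds the fully assembled message
                .blockingGet()
                .getAccumulatedMessage();
        return assembled.getFunctionCall(); // null when the model replied with plain text instead
    }
}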

So my general recommendation would be to have a FunctionExecutor, and let it do the dirty work for you. For example:

First, configure the executor of your function:

functionBuilder.executor(WeatherRequest.class, w -> new WeatherResponse(w.location, w.unit, 25, "sunny"));

Then, use FunctionExecutor to get the response:

WeatherResponse functionExecutionResponse = functionExecutor.execute(chatFunctionCall);

Or even better:

ChatMessage callResponse = functionExecutor.executeAndConvertToMessageHandlingExceptions(chatFunctionCall);

That way you can just add it to the conversation right away without even worrying about exceptions.
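
(Tying those three snippets together, here is a minimal sketch of the whole round trip. WeatherRequest and WeatherResponse are the hypothetical example types from the snippet above, and the ChatFunction builder and FunctionExecutor constructor shapes are assumptions that may differ slightly in the merged code.)

import com.theokanning.openai.completion.chat.ChatFunction;
import com.theokanning.openai.completion.chat.ChatFunctionCall;
import com.theokanning.openai.completion.chat.ChatMessage;
import com.theokanning.openai.service.FunctionExecutor;

import java.util.Collections;
import java.util.List;

class FunctionRoundTripSketch {
    // Register the function and the executor that produces its result.
    static FunctionExecutor buildExecutor() {
        ChatFunction weatherFunction = ChatFunction.builder()
                .name("get_current_weather")
                .description("Get the current weather for a location")
                .executor(WeatherRequest.class, w -> new WeatherResponse(w.location, w.unit, 25, "sunny"))
                .build();
        return new FunctionExecutor(Collections.singletonList(weatherFunction));
    }

    // After each assistant reply: if the model asked for a function, run it and append the
    // resulting "function" message so it can be sent back in the next request.
    static void handleAssistantReply(List<ChatMessage> conversation, ChatMessage assistantMessage,
                                     FunctionExecutor functionExecutor) {
        conversation.add(assistantMessage);
        ChatFunctionCall functionCall = assistantMessage.getFunctionCall();
        if (functionCall != null) {
            ChatMessage callResponse =
                    functionExecutor.executeAndConvertToMessageHandlingExceptions(functionCall);
            conversation.add(callResponse);
        }
    }
}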

@Eduardo-T
Contributor Author

Eduardo-T commented Jun 20, 2023

> @Eduardo-T Thanks for doing this! I merged another PR that created some small conflicts. Also, can you add a basic test for this? Thanks again!

No problem! It is done.

It now supports functions and functions with streaming, has tests, adds documentation to several classes, and includes usage information in the README.md.

@TheoKanning

@shaun-wild

I'm a fan of the function-executing solution you've come up with here, but personally I don't feel it fits within the premise of the openai-java wrapper; it feels to me like it's doing too much and would be better suited as a separate library.

@Eduardo-T
Contributor Author

Eduardo-T commented Jun 20, 2023

> I'm a fan of the function-executing solution you've come up with here, but personally I don't feel it fits within the premise of the openai-java wrapper; it feels to me like it's doing too much and would be better suited as a separate library.

Thank you!

I don't think it's much, really; everything added is necessary for the new functions feature to work well and be easy to use, and it's all inside the 'service' module.

I've seen other libraries where you have to write the schema objects for your functions by hand, and it's horrible. I didn't want that to happen to this library, which is characterized by being nice and simple to use.

The only classes that could stand out, and I suppose they are the ones you mention being a fan of, are FunctionExecutor and ChatMessageAccumulator. Both are optional but highly recommended, since they let the new mechanics be used with a simplicity similar to libraries in other languages. But that's really all.

So there aren't really any additions beyond the necessary changes: API objects, serializers and deserializers, etc.

Adding one more module to the project would, I think, add complexity, but I am interested in your proposal; could you give more details?

@aaronuu
Contributor

aaronuu commented Jun 21, 2023

I would like to know when it can be merged into the main line

@TheoKanning

@TheoKanning merged commit c24e78b into TheoKanning:main on Jun 22, 2023