Support Complex Parameter Schemas #124
Conversation
Well tested and documented. Compatible with my fork - appreciate it!
Force-pushed from 5392f9a to 456ecbb
Nice one! I like it. The only issue is I really need support for Gemini myself, so I cannot use this one yet in production. I'm happy to close mine or merge any elements. I'll let @crmne know what he wants. My goal is to use RubyLLM for all production tasks, and I need structured outputs for Claude, Gemini, and OpenAI. Using the prompt method works very well.
@kieranklaassen sounds great. Yeah, let's wait to hear from @crmne first. Then I can merge your code for Gemini support into my PR, since that one is the hardest to support, if he decides to go that direction.
Force-pushed from 456ecbb to 9e03cd8
Are you guys commenting on the wrong PR or is it just me?
@tpaulshippy not just you - the comments went to the wrong PR. Those 2 comments should be on @kieranklaassen's PR for structured output.
…g the existing simple methods + bump version to 1.3 since this is kind of a major feature.
Force-pushed from 30e2eb6 to f4e1c47
Summary
Adds the ability to define more complex param schemas, while retaining the existing simple methods.
This new format is also forward-compatible for when LLMs start supporting richer JSON Schema structures (and at the pace they are moving, that may not be long from now).
Bumped version to 1.3 since this is kind of a major feature.
Examples
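The example code from the original PR description did not survive extraction, so here is an illustrative sketch instead. The first tool uses RubyLLM's existing documented simple style (`description` plus flat `param` declarations). The second shows the kind of nested schema this PR enables; the `params`-with-a-hash spelling is an assumed name for illustration, not the PR's confirmed API — see the diff for the actual syntax.

```ruby
require "ruby_llm"

# Existing simple style: flat, scalar parameters only.
class Forecast < RubyLLM::Tool
  description "Returns a short weather forecast for a city"
  param :city, type: :string, desc: "City name"

  def execute(city:)
    "Sunny in #{city}" # placeholder body
  end
end

# Hypothetical complex style: a nested, JSON-Schema-like definition
# with objects, arrays, enums, and per-property requirements.
# NOTE: `params type:, properties:, required:` is an assumption made
# for this sketch, not necessarily the DSL the PR introduces.
class BatchForecast < RubyLLM::Tool
  description "Returns forecasts for several coordinates"

  params type: :object,
         properties: {
           units: { type: :string, enum: %w[metric imperial] },
           locations: {
             type: :array,
             items: {
               type: :object,
               properties: {
                 lat: { type: :number },
                 lon: { type: :number }
               },
               required: %w[lat lon]
             }
           }
         },
         required: %w[locations]

  def execute(units: "metric", locations:)
    suffix = units == "metric" ? "C" : "F"
    locations.map { |loc| "#{loc[:lat]},#{loc[:lon]}: 21°#{suffix}" }
  end
end
```

Whatever the final DSL looks like, a tool definition ultimately serializes into the JSON Schema `parameters` object sent in each provider's tool/function-call payload, which is what makes the richer format forward-compatible as providers accept more of the JSON Schema spec.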
How I tested
I tested all the supported LLM APIs directly.
Shout-outs