
Remove packing and default batch size from FT cli #60

Merged
mwu1993 merged 2 commits into main from public/remove_packing on Dec 21, 2021

Conversation

@mwu1993 (Collaborator) commented Dec 21, 2021

No description provided.

@mwu1993 changed the title from "Remove packing and batch size from FT cli" to "Remove packing and default batch size from FT cli" on Dec 21, 2021
@mwu1993 mwu1993 merged commit 26fbacb into main Dec 21, 2021
@mwu1993 mwu1993 deleted the public/remove_packing branch December 21, 2021 20:24
classicvalues pushed a commit to classicvalues/openai-python that referenced this pull request Jan 7, 2022
* Add a codex backtranslation example to improve SQL queries (openai#58)

* Add a codex backtranslation example to improve SQL queries

* Boris update ft example (openai#57)

* update fine-tune example to show the new CLI outputs

* model specification for search (openai#60)

* Catch chunked encoding errors and retry (openai#63)

* Add batch suggestion logic to prepare_data for fine_tunes and custom Q&A answers logic (openai#62)

* Add batch suggestion logic to prepare_data for fine_tunes; add an example of how to create a rudimentary answers endpoint with a custom Q&A model

Co-authored-by: Madeleine Thompson <[email protected]>
Co-authored-by: hallacy <[email protected]>
cgayapr pushed a commit to cgayapr/openai-python that referenced this pull request Dec 14, 2024
safa0 pushed a commit to safa0/openai-agents-python that referenced this pull request Apr 27, 2025
…enai#60)

This PR introduces a `strict_mode: bool = True` option to
`@function_tool`, allowing optional parameters when it is set to `False`.
This change enables more flexibility while maintaining strict JSON schema
validation by default.

resolves openai#43 

## Changes:

- Added `strict_mode` parameter to `@function_tool` and passed it to
`function_schema` and `FunctionTool`.
- Updated `function_schema.py` to respect `strict_mode` and allow
optional parameters when set to False.
- Added unit tests to verify optional parameters work correctly,
including multiple optional params with different types.

## Tests:

- Verified that function calls with missing optional parameters behave as
expected.
- Added async tests to validate behavior under different configurations.
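
For illustration, here is a minimal usage sketch of the new flag. It assumes the decorator is importable as `from agents import function_tool`; the `get_temperature` tool below is hypothetical and not part of the PR.

```python
# Minimal sketch (not from the PR): assumes `function_tool` is importable
# from the `agents` package and accepts the `strict_mode` flag added here.
from typing import Optional

from agents import function_tool


# With the default strict_mode=True, the generated JSON schema marks every
# parameter as required, so a tool with optional parameters is rejected.
# strict_mode=False relaxes the schema and lets callers omit `unit`.
@function_tool(strict_mode=False)
def get_temperature(city: str, unit: Optional[str] = None) -> str:
    """Illustrative tool: report a temperature for `city` in `unit`."""
    unit = unit or "celsius"
    return f"21 degrees {unit} in {city}"
```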