CLI erroneously sends unsupported parameters (temperature/top_p) to the o3-mini model #2072
Comments
Thanks for the report! This will be fixed in the next release #2078
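The fix in #2078 isn't reproduced here, but the usual remedy for this class of bug is to forward only the parameters the user actually set, so unset ones never reach the request body. A minimal sketch of that pattern, with hypothetical function and argument names (not the actual patch):

```python
# Hypothetical sketch of the "omit unset parameters" pattern; the names
# mirror the CLI flags but this is not the code from #2078.
def build_request_params(model, temperature=None, top_p=None):
    params = {"model": model}
    # Only include optional sampling parameters the caller explicitly set;
    # models like o3-mini reject requests that carry them at all.
    if temperature is not None:
        params["temperature"] = temperature
    if top_p is not None:
        params["top_p"] = top_p
    return params
```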
Thank you!!
Hello. I still get the same error when setting the temperature with o3-mini. Do you have an estimate of when this might get fixed?
For API-level support, please go to the community.openai.com forum.
Hi, I think this same error happens in the Python library itself, not from the OpenAI API. If you create an assistant with the gpt-4o-mini model and then try to update its model to o3-mini, you end up getting the same unsupported-parameter error.
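A minimal sketch of that sequence with the standard `openai` client; the assistant name and instructions below are placeholders:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Create an assistant on gpt-4o-mini; name/instructions are placeholder values.
assistant = client.beta.assistants.create(
    model="gpt-4o-mini",
    name="demo-assistant",
    instructions="You are a helpful assistant.",
)

# Switching the same assistant to o3-mini is the step that reportedly
# fails with the unsupported-parameter error.
client.beta.assistants.update(assistant.id, model="o3-mini")
```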
Confirm this is an issue with the Python library and not an underlying OpenAI API
Describe the bug
Using `openai api chat.completions.create` with the newly released `o3-mini-2025-01-31` model triggers errors about unsupported parameters even when those parameters are not explicitly set in the CLI command. Specifically, `temperature` and `top_p` appear to be sent to the API, causing 400 errors.

However, if you set `--temperature "1"` and `--top_p "1"`, then no error is produced and a chat response is obtained.

To Reproduce
1. Run the command specifying neither `--temperature` nor `--top_p`: the request is rejected, and the error shows the CLI sent `"temperature": 0.5`.
2. The error indicates the model only accepts the default `"temperature": 1`.
3. Run again with `--temperature "1"`, now with a new `"Unsupported parameter: 'top_p' is not supported with this model."` error.
4. Run with `--temperature "1" --top_p "1"`: no error is produced and a chat response is obtained.
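For contrast, calling the endpoint directly through the Python SDK and simply omitting both parameters succeeds, which points at the CLI layer rather than the API. A minimal sketch, assuming `OPENAI_API_KEY` is set and using an illustrative prompt:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# temperature and top_p are omitted entirely, so the SDK leaves them
# out of the request body and o3-mini accepts the call.
response = client.chat.completions.create(
    model="o3-mini-2025-01-31",
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```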
Code snippets
OS
macOS
Python version
Python v3.12.5
Library version
openai 1.61.0