CLI erroneously sends unsupported parameters (temperature/top_p) to the o3-mini model #2072


Closed
1 task done
MrJarnould opened this issue Feb 2, 2025 · 5 comments
Labels
bug Something isn't working

Comments

@MrJarnould

MrJarnould commented Feb 2, 2025

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Using openai api chat.completions.create with the newly released o3-mini-2025-01-31 model triggers errors about unsupported parameters even when those parameters are not explicitly set in the CLI command. Specifically, temperature and top_p appear to be sent to the API, causing 400 errors.

However, if you set --temperature "1" and --top_p "1", then no error is produced and a chat response is obtained.

Note: openai was installed using pipx v1.7.1.
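
For comparison, calling the Python library directly (rather than through the CLI) with both parameters simply left unset succeeds, which suggests the CLI is injecting default values rather than the library or the API misbehaving. A minimal sketch, assuming OPENAI_API_KEY is set in the environment:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# temperature and top_p are not passed here, so the library
# omits them from the request body entirely and the call succeeds.
resp = client.chat.completions.create(
    model="o3-mini-2025-01-31",
    messages=[{"role": "user", "content": "What's the capital of France?"}],
)
print(resp.choices[0].message.content)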

To Reproduce

  1. Use the CLI without specifying --temperature or --top_p:
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31"
Error: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
  2. Confirm the same request works using cURL:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "reasoning_effort": "low"
  }'
{
  "id": "chatcmpl-AwNe73vdAtaqbpdecVpG3XNRwfLbt",
  "object": "chat.completion",
  "created": 1738477283,
  "model": "o3-mini-2025-01-31",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris.",
        "refusal": null
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 17,
    "total_tokens": 29,
    "prompt_tokens_details": {
      "cached_tokens": 0,
      "audio_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 0,
      "audio_tokens": 0,
      "accepted_prediction_tokens": 0,
      "rejected_prediction_tokens": 0
    }
  },
  "service_tier": "default",
  "system_fingerprint": "fp_8bcaa0ca21"
}
  3. Observe that the cURL command fails when setting "temperature": 0.5:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "temperature": 0.5,
    "reasoning_effort": "low"
  }'
{
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}
  4. Observe that the cURL command works when setting "temperature": 1:
❯ curl \
  https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "o3-mini-2025-01-31",
    "messages": [
      {
        "role": "user",
        "content": [
          {
            "type": "text",
            "text": "What'\''s the capital of France?"
          }
        ]
      }
    ],
    "response_format": {
      "type": "text"
    },
    "temperature": 1,
    "reasoning_effort": "low"
  }'
{
  "id": "chatcmpl-AwNYOeW1akjt6dMLkH3UNSrdXk5tV",
  "object": "chat.completion",
  "created": 1738476928,
  "model": "o3-mini-2025-01-31",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "The capital of France is Paris.",
        "refusal": null
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 17,
    "total_tokens": 29,
    "prompt_tokens_details": {
      "cached_tokens": 0,
      "audio_tokens": 0
    },
    "completion_tokens_details": {
      "reasoning_tokens": 0,
      "audio_tokens": 0,
      "accepted_prediction_tokens": 0,
      "rejected_prediction_tokens": 0
    }
  },
  "service_tier": "default",
  "system_fingerprint": "fp_8bcaa0ca21"
}
  5. Observe that the CLI still fails when setting --temperature "1", now with a new "Unsupported parameter: 'top_p' is not supported with this model." error:
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31" --temperature "1"
Error: Error code: 400 - {'error': {'message': "Unsupported parameter: 'top_p' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'top_p', 'code': 'unsupported_parameter'}}
  6. Observe that the CLI request works when setting --temperature "1" --top_p "1":
❯ openai api chat.completions.create --message "user" "What's the capital of France?" -m "o3-mini-2025-01-31" --temperature "1" --top_p "1"
The capital of France is Paris.
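
This behavior is consistent with the CLI giving --temperature and --top_p concrete defaults instead of leaving them unset. A minimal sketch of the distinction, using the library's NOT_GIVEN sentinel; the temperature=None default below is a hypothetical stand-in for "flag not passed", not the CLI's actual implementation:

from openai import OpenAI, NOT_GIVEN

client = OpenAI()

def create_chat(model: str, prompt: str, temperature: float | None = None):
    return client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        # Only serialize temperature when the user actually supplied it;
        # NOT_GIVEN keeps the field out of the request body entirely.
        temperature=temperature if temperature is not None else NOT_GIVEN,
    )

reply = create_chat("o3-mini-2025-01-31", "What's the capital of France?")
print(reply.choices[0].message.content)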

Code snippets

OS

macOS

Python version

Python v3.12.5

Library version

openai 1.61.0

@MrJarnould added the bug (Something isn't working) label on Feb 2, 2025
@RobertCraigie
Collaborator

Thanks for the report! This will be fixed in the next release: #2078

@csjcoderepo

Thank you!!

@fateme-hshm96

Hello. I still get the same error when setting the temperature with o3-mini. Do you have an estimate of when this might get fixed?
Thank you!

@RobertCraigie
Collaborator

For API-level support, please go to the community.openai.com forum.

@py-goh

py-goh commented Mar 6, 2025

Hi, I think the same kind of error happens in the Python library itself, not just the OpenAI API.

If you create an assistant with the gpt-4o-mini model using client.beta.assistants.create()

and then update its model to o3-mini with client.beta.assistants.update(),

you end up getting this error:

...
  File "/usr/local/lib/python3.8/dist-packages/openai/resources/beta/assistants.py", line 355, in update
    return self._post(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 1242, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 919, in request
    return self._request(
  File "/usr/local/lib/python3.8/dist-packages/openai/_base_client.py", line 1023, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'reasoning_effort' does not support 'null' with this model.", 'type': 'invalid_request_error', 'param': None, 'code': 'unsupported_value'}}