Confirm this is a feature request for the Python library and not the underlying OpenAI API.
This is a feature request for the Python library
Describe the feature or improvement you're requesting
Hi OpenAI team 👋
We’ve noticed that the AzureOpenAI and AsyncAzureOpenAI clients do not currently expose the .responses resource the way the default OpenAI client does. This is limiting, especially since Azure OpenAI recently added support for the /openai/responses endpoint as part of the 2025-03-01-preview API version.
Currently, attempting to use:
client = AsyncAzureOpenAI(...)
await client.responses.create(...)
raises an AttributeError because .responses is not available on those classes.
It would be great if .responses were added to AzureOpenAI and AsyncAzureOpenAI, similarly to how .chat.completions is exposed.
• Azure now supports /openai/responses endpoint
• model / deployment ID is passed in the JSON body, not in the path
• No need to add it to _deployments_endpoints
• Adding this would make the SDK consistent and easier to use for Azure users
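The point about the deployment ID living in the JSON body can be sketched as follows. This is an illustrative comparison only; the URLs and the deployment name are placeholders, not real endpoints, and the dicts are not actual SDK structures:

```python
# Chat completions historically route by deployment in the URL path,
# which is why those endpoints appear in _deployments_endpoints:
chat_request = {
    "url": "https://my-resource.openai.azure.com/openai/deployments/my-deployment/chat/completions",
    "body": {"messages": [{"role": "user", "content": "Hello"}]},
}

# The Responses API instead carries the deployment (model) in the JSON body,
# so the URL needs no per-deployment rewriting:
responses_request = {
    "url": "https://my-resource.openai.azure.com/openai/responses",
    "body": {"model": "my-deployment", "input": "Hello"},
}

assert "/deployments/" not in responses_request["url"]
assert responses_request["body"]["model"] == "my-deployment"
```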
Thanks in advance! Happy to contribute a PR if you’re open to it.
Best,
Nikolai
Additional context
No response
client = AsyncAzureOpenAI(...)
await client.responses.create(...)
results in an AttributeError because the .responses property isn't available.
Since Azure now supports the /openai/responses endpoint (using the 2025-03-01-preview API version), it makes sense to add this feature. It would align the behavior of these clients with how .chat.completions is implemented, making everything more consistent. Plus, because the model/deployment ID is passed in the JSON body instead of the endpoint path, integrating this shouldn't require major changes.
Overall, this enhancement would make the client more user-friendly for Azure users. Thanks again for proposing this improvement, and it's great to know you're willing to help with a PR if needed!
@NikGor, @kiranimmadi2, which version of the openai library are you encountering this with? Azure support for /responses should certainly be present; basic Azure OpenAI use of the synchronous client is documented on learn.microsoft.com and I just validated the following quick modification to use the async client with 1.74.1 (and an earlier 1.69.0 environment for a positive comparison):
import asyncio
import os

from openai import AsyncAzureOpenAI

async def main():
    client = AsyncAzureOpenAI(
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        api_version="2025-03-01-preview",
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    )
    response = await client.responses.create(
        model="gpt-4o-mini",  # replace with your model deployment name
        input="This is a test.",
        # truncation="auto" is required when using the computer-use-preview model.
    )
    print(response.output[0])

asyncio.run(main())