responses.parse returns an error in streaming mode #2305


Open
1 task done
futuremojo opened this issue Apr 14, 2025 · 0 comments
Labels
bug Something isn't working

Comments

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

When I use a Pydantic model with responses.parse, it works fine in synchronous, non-streaming mode. But when I stream the response with the async client (stream=True), I get an error.

See test code below.

To Reproduce

The code to reproduce the issue is below.

When the code is run, the synchronous version of responses.parse returns a correct response. The streaming (async) version raises this error:

AttributeError                            Traceback (most recent call last)
File ~/projects/pdf_summarizer/.venv/lib/python3.12/site-packages/openai/lib/_parsing/_responses.py:62, in parse_response(text_format, input_tools, response)
     59 solved_t = solve_response_format_t(text_format)
     60 output_list: List[ParsedResponseOutputItem[TextFormatT]] = []
---> 62 for output in response.output:
     63     if output.type == "message":
     64         content_list: List[ParsedContent[TextFormatT]] = []

AttributeError: 'str' object has no attribute 'output'
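The traceback suggests that in streaming mode the parsing helper is handed a raw string where it expects a Response object with an `.output` list. A minimal offline sketch of that failure mode (no API call; `parse_like_sdk` is a hypothetical stand-in for the SDK internals, not the actual library code):

```python
# Hypothetical stand-in for the SDK's parse_response: it iterates
# response.output, but in streaming mode it appears to receive a plain
# string, so the attribute lookup fails exactly as in the traceback.
def parse_like_sdk(response):
    for output in response.output:  # AttributeError when response is a str
        pass

try:
    parse_like_sdk("event: response.output_text.delta")
except AttributeError as e:
    print(e)  # 'str' object has no attribute 'output'
```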

Code snippets

from typing import List

import openai
from pydantic import BaseModel


class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: List[str]

def test_response_parse_sync():
    client = openai.OpenAI()

    response = client.responses.parse(
        model="gpt-4o-2024-08-06",
        input=[
            {"role": "system", "content": "Extract the event information."},
            {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
        ],
        text_format=CalendarEvent,
    )
    return response

async def test_response_parse_async():
    client = openai.AsyncOpenAI()

    stream = await client.responses.parse(
        model="gpt-4o-2024-08-06",
        input=[
            {"role": "system", "content": "Extract the event information."},
            {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
        ],
        text_format=CalendarEvent,
        stream=True,
    )
    async for event in stream:
        print(event)


# Returns a response conforming to the Pydantic model.
print(test_response_parse_sync())

# Errors out with the AttributeError shown above.
# (Top-level await only works in a REPL/notebook, so use asyncio.run in a script.)
import asyncio
asyncio.run(test_response_parse_async())
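For context on what `text_format=CalendarEvent` does in the working synchronous call: the SDK validates the model's JSON output against the Pydantic model. The equivalent manual validation step, runnable offline (the JSON payload below is illustrative, not an actual API response):

```python
from typing import List

from pydantic import BaseModel

class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: List[str]

# Hypothetical JSON payload of the shape the model is asked to produce.
raw = '{"name": "Science Fair", "date": "Friday", "participants": ["Alice", "Bob"]}'
event = CalendarEvent.model_validate_json(raw)
print(event.participants)  # ['Alice', 'Bob']
```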

OS

macOS

Python version

Python v3.12

Library version

OpenAI v1.73.0

@futuremojo futuremojo added the bug Something isn't working label Apr 14, 2025