asyncio: #10 shutdown_asyncgens #11


Merged 1 commit into main on Oct 23, 2021

Conversation

delfick (Owner) commented Oct 19, 2021:

Ensure that any async generators left running past run_until_complete
(essentially only if a KeyboardInterrupt stops the run_until_complete)
are finalized properly.

Note: I don't use loop.shutdown_asyncgens so that I can make the generator
handle an asyncio.CancelledError rather than a GeneratorExit.

assert info2 == []

async def doit():
ag = my_generator(info2)

I think you can get the situation you want in a simpler (and maybe more realistic) scenario by simply not iterating the async generator all the way through - in this case calling await ag.__anext__() only once and not doing os.kill(). The expected output would then be info2 == [1, "cancelled", ("done", asyncio.CancelledError)]

delfick (Owner, Author) replied:

Without the os.kill, cancel_all_tasks will cancel it and shutdown_asyncgens doesn't actually do anything. Also, the second __anext__ doesn't complete the generator; it's still running at that point. It's the third one that completes it (the first __anext__ gets to the first yield, the second gets to the second yield, the third runs from there to the end).
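The step counting described here can be sketched with a toy generator (hypothetical names, not the PR's test code):

```python
import asyncio


async def gen():
    yield "a"  # first __anext__ suspends here
    yield "b"  # second __anext__ suspends here; the generator is still live
    # a third __anext__ runs from here to the end -> StopAsyncIteration


async def main():
    g = gen()
    assert await g.__anext__() == "a"
    assert await g.__anext__() == "b"  # two steps in: not yet complete
    try:
        await g.__anext__()  # the third step exhausts the generator
    except StopAsyncIteration:
        return "exhausted"


assert asyncio.run(main()) == "exhausted"
```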

nzig commented Oct 20, 2021:

OK, I tried this out myself and it's a little complicated, but here's an example of a failure case without this PR:

import asyncio
from alt_pytest_asyncio.plugin import OverrideLoop

info2 = []


async def my_generator(info):
    try:
        info.append(1)
        yield
        info.append(2)
        yield
        info.append(3)
    except asyncio.CancelledError:
        info.append("cancelled")
        raise
    finally:
        info.append(("done", __import__("sys").exc_info()[0]))


ag = my_generator(info2)


async def doit():
    await ag.__anext__()
    await ag.__anext__()


with OverrideLoop(new_loop=True) as custom_loop:
    custom_loop.run_until_complete(doit())

assert info2 == [1, 2, "cancelled", ("done", asyncio.CancelledError)]

The key here is defining ag = my_generator(info2) outside. The reason is that asyncio uses sys.set_asyncgen_hooks() to set a finalizer that will athrow() into any async generator when it is GC'd (see PEP 525). If ag were defined inside doit(), there would be no references to it after doit() is done, so it would probably get garbage collected (though this is not guaranteed), and then asyncio's finalizer would create a task to do athrow() on the async generator. That task would then get cancelled by cancel_all_tasks(), causing the test to pass without os.kill().
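A minimal sketch of the PEP 525 hook mechanism described here; the generator is driven by hand so no event loop installs its own hooks over ours:

```python
import gc
import sys

seen = []


async def agen():
    yield 1


old_hooks = sys.get_asyncgen_hooks()
sys.set_asyncgen_hooks(
    firstiter=lambda g: seen.append("firstiter"),
    finalizer=lambda g: seen.append("finalizer"),
)
try:
    g = agen()
    try:
        g.__anext__().send(None)  # first step; the hooks are captured here
    except StopIteration:
        pass  # the first yielded value arrives as StopIteration(1)
    del g
    gc.collect()  # dropping the last reference invokes the finalizer hook
    assert seen == ["firstiter", "finalizer"]
finally:
    sys.set_asyncgen_hooks(*old_hooks)
```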

In other words, the test may fail occasionally without os.kill(), but this is up to GC. By forcing ag to remain alive we can get it to fail every time.

delfick (Owner, Author) commented Oct 20, 2021:

On looking at that I'm confused why it wouldn't complain that the generator is being run with a different loop to the one it was created in. But I suppose the firstiter hook doesn't get called till the first __anext__, so it makes sense that making the async generator object itself doesn't do anything with the loop.
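The timing of the firstiter hook can be checked directly (a sketch with hypothetical names; the generator is stepped manually so asyncio's own hooks don't get in the way):

```python
import sys

calls = []


async def agen():
    yield


old_hooks = sys.get_asyncgen_hooks()
sys.set_asyncgen_hooks(firstiter=lambda g: calls.append("firstiter"))
try:
    g = agen()
    assert calls == []  # creating the generator object fires nothing
    try:
        g.__anext__().send(None)  # the first step is what fires the hook
    except StopIteration:
        pass
    assert calls == ["firstiter"]
    try:
        g.aclose().send(None)  # close the generator synchronously
    except StopIteration:
        pass
finally:
    sys.set_asyncgen_hooks(*old_hooks)
```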

I'll have a play later. If that removes the os.kill that'd be nice.

nzig replied:

There is no requirement to run an async generator only on one loop 🙂 :

>>> import asyncio
>>> async def foo():
...     yield 1
...     yield 2
... 
>>> g=foo()
>>> asyncio.run(g.__anext__())
1
>>> asyncio.run(g.__anext__())
2

I don't think an async generator has a concept of a "loop it was created in".

delfick (Owner, Author) replied:

Thanks for that, I understand now. The set that shutdown_asyncgens looks at holds weak references, which is why defining the generator outside the context manager works: the weakref is still valid inside the context manager's __aexit__.
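The registry behaviour described here can be sketched with a plain weakref.WeakSet; asyncio's loop._asyncgens is a weak set of this kind, though that attribute is a CPython implementation detail:

```python
import gc
import weakref


async def agen():
    yield


registry = weakref.WeakSet()

g = agen()
registry.add(g)
assert len(registry) == 1  # while a strong reference exists, the entry survives
del g
gc.collect()
assert len(registry) == 0  # once collected, the weak entry silently disappears
```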

Anyway, the PR has been updated to not need to send itself a signal for that test.

Does the implementation in this PR fix the problem that led to #10?

delfick (Owner, Author) replied:

    >>> asyncio.run(g.__anext__())
    1
    >>> asyncio.run(g.__anext__())
    2

Ever since 2016 it's been a constant fight with asyncio for me because of this project https://github.com/delfick/photons (and specifically the non-open-source version it's based on).

I'm a big fan of asyncio, but it has some weird edges that cause annoying surprises. The above surprises me because of how angry asyncio gets when a future from one loop is used by another. So I'd have imagined it'd be fairly simple to make a generator where it complains about the above.
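The cross-loop complaint mentioned here is easy to reproduce with a future (a sketch; it assumes CPython's current error message, which mentions "attached to a different loop"):

```python
import asyncio

loop_a = asyncio.new_event_loop()
fut = loop_a.create_future()  # this future belongs to loop_a


async def waiter():
    await fut  # awaited by a task running on a second loop


try:
    asyncio.run(waiter())  # asyncio.run creates its own, different loop
    error = ""
except RuntimeError as e:
    error = str(e)  # "Task ... got Future ... attached to a different loop"

assert "different loop" in error
loop_a.close()
```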

nzig replied:

Yes, this solves the problem that led to #10. Thanks!

delfick (Owner, Author) replied:

Wicked awesome! I assume I'll make a release next week (and if #9 isn't ready by then I'll make a release without that).

delfick force-pushed the f-asyncio/shutdown-asyncgens branch 2 times, most recently from 7b64aec to 16f86bd, on October 20, 2021 at 10:08
delfick (Owner, Author) commented Oct 20, 2021:

Given that the test only fails on 3.6, and that in #9 I plan on making this project 3.7+ only, I'm gonna merge this after I merge that PR.

assert info3 == [1]

custom_loop.run_until_complete(doit())
assert list(custom_loop.loop._asyncgens) == [ag]
nzig commented:

This depends on the implementation of the GC, because it assumes ag2 is collected before this line. That's probably fine given how CPython's GC works right now, but it's not guaranteed to stay the same, so this test might break in a future version of Python (or even current implementations of Python other than CPython).
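One way to sidestep the GC-dependence pointed out here is to finalize the generator explicitly instead of waiting for collection; a minimal sketch with hypothetical names:

```python
import asyncio

events = []


async def agen():
    try:
        yield
    finally:
        events.append("finalized")


async def main():
    g = agen()
    await g.__anext__()  # start the generator; it is now suspended at yield
    await g.aclose()  # throw GeneratorExit in and run the finally block now


asyncio.run(main())
assert events == ["finalized"]
```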

delfick (Owner, Author) replied:

This is true. Probably fine for now *shrugs*.

I'll put it in the "if someone complains about it" basket :)

delfick force-pushed the f-asyncio/shutdown-asyncgens branch from 16f86bd to c5f5d2a on October 23, 2021 at 00:35
delfick merged commit b1862d4 into main on Oct 23, 2021
delfick deleted the f-asyncio/shutdown-asyncgens branch on October 23, 2021 at 00:37
delfick (Owner, Author) commented Oct 23, 2021:

I decided I had time this morning to push this out and you can find it in 0.6.0 :)
