Document how to wait on multiple futures #48
@dmwyatt I'd like to help you with this. Could you give me an example of what you'd like to do? What should we add to this example:
What you'd like to do is to be able to wait on both at once? If this is the case, I just did something like that:

#!/usr/bin/env python
import asyncio
import aioredis
import websockets


@asyncio.coroutine
def simulate_web_task():
    redis = yield from aioredis.create_redis(('localhost', 6379))
    print("$ Web server sleeps")
    yield from asyncio.sleep(1)
    print("$ Web server add task hello")
    yield from redis.lpush("tasks", "hello")
    yield from asyncio.sleep(2)


@asyncio.coroutine
def simulate_client():
    websocket = yield from websockets.connect('ws://localhost:8765')
    print("> Client sleeps")
    yield from asyncio.sleep(2)
    print("> Client sends Hello")
    yield from websocket.send("Hello")
    yield from asyncio.sleep(2)


@asyncio.coroutine
def server_handler(websocket, path):
    redis = yield from aioredis.create_redis(('localhost', 6379))
    client_task = websocket.recv()
    web_task = redis.blpop("tasks")
    print("# Create client and web tasks")
    pending = {client_task, web_task}
    counter = 0
    while websocket.open and counter < 2:
        done, pending = yield from asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            if task is client_task:
                client_message = task.result()
                print("# Client says: %s" % client_message)
                client_task = websocket.recv()
                pending.add(client_task)
            elif task is web_task:
                web_message = task.result()
                print("# Web server says: %s" % web_message[1].decode('utf-8'))
                web_task = redis.blpop("tasks")
                pending.add(web_task)
            counter += 1
    print("# End of demo")


start_server = websockets.serve(server_handler, 'localhost', 8765)
asyncio.get_event_loop().run_until_complete(start_server)
asyncio.get_event_loop().run_until_complete(
    asyncio.wait({simulate_client(), simulate_web_task()})
)
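One detail worth noting about the example above: asyncio.wait() wraps bare coroutines in Task objects, so the objects that come back in done are not the same objects as client_task and web_task, and the `task is client_task` checks won't match. Wrapping explicitly with asyncio.ensure_future() avoids that. Below is a minimal sketch of the same waiting loop in current async/await syntax; the `websocket_recv` and `redis_blpop` coroutine functions are hypothetical stand-ins for websocket.recv and a Redis blpop, not part of the original code:

import asyncio

async def multiplex(websocket_recv, redis_blpop):
    # ensure_future() gives us Task objects up front, so the objects that
    # come back in `done` are the very ones we compare against below.
    client_task = asyncio.ensure_future(websocket_recv())
    web_task = asyncio.ensure_future(redis_blpop())
    pending = {client_task, web_task}
    for _ in range(2):  # two rounds, like the counter in the demo above
        done, pending = await asyncio.wait(
            pending, return_when=asyncio.FIRST_COMPLETED)
        for task in done:
            if task is client_task:
                print("# Client says: %s" % task.result())
                client_task = asyncio.ensure_future(websocket_recv())
                pending.add(client_task)
            elif task is web_task:
                print("# Web server says: %s" % task.result())
                web_task = asyncio.ensure_future(redis_blpop())
                pending.add(web_task)
    # Don't leave the slower task dangling when the demo ends.
    for task in pending:
        task.cancel()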
Correct me if I'm wrong:
should be:
I am trying to do this and the example in the docs doesn't work. In the example, when one task yields it restarts both tasks. This means any data being waited on by the slower task is lost, asyncio.sleep calls are ignored, and so on. I got it to work with the following, which is more like how you'd do it with gevent. It would be simpler if …
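The code this comment refers to isn't reproduced above. As a rough guess at the gevent-like shape being described, one way is to spawn a long-lived listener task once (much like spawning a greenlet) and let the handler consume its output from a queue, so nothing gets cancelled and recreated on every loop iteration. A sketch, with the echo reply purely as placeholder logic:

import asyncio

async def listener(websocket, inbox):
    # Runs independently for the lifetime of the connection,
    # much like a spawned greenlet would.
    while True:
        await inbox.put(await websocket.recv())

async def handler(websocket, path):
    inbox = asyncio.Queue()
    listen_task = asyncio.ensure_future(listener(websocket, inbox))
    try:
        while True:
            message = await inbox.get()
            await websocket.send("got: " + message)
    finally:
        # Make sure the helper task doesn't outlive the handler.
        listen_task.cancel()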
While the above code does work, I can't figure out how to gracefully stop it to prevent pending task errors. To gracefully stop the loop it seems like you need access to both the loop and the tasks, and of course you're not allowed to stop and restart the loop from inside the loop itself. I can't get access to the websocket client outside of the loop, so I can't create the tasks outside of the loop. By the time I have access to both the loop and the tasks, I can't gracefully stop the loop because I'm inside it.
This is really a question about multiplexing and synchronizing asyncio tasks, not about websockets. You haven't really described what you're trying to do, but I think the following should work:
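The snippet that followed isn't reproduced above. As a sketch of one way to multiplex the work while still being able to stop it from outside, the handler can wait on a stop future alongside its other awaitables and exit when that future is resolved; the `stop` future and the print-based handling are assumptions for illustration, not the code originally posted:

import asyncio

async def handler(websocket, stop):
    # `stop` is an asyncio.Future that the rest of the program resolves
    # when the connection should be shut down.
    recv_task = asyncio.ensure_future(websocket.recv())
    while True:
        done, pending = await asyncio.wait(
            {recv_task, stop}, return_when=asyncio.FIRST_COMPLETED)
        if stop in done:
            # Someone asked us to stop: cancel the read and close cleanly.
            recv_task.cancel()
            await websocket.close()
            break
        print("client said:", recv_task.result())
        recv_task = asyncio.ensure_future(websocket.recv())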
Applying techniques for implicit async I/O with gevent to asyncio's explicit async I/O is unlikely to go well.
If you can propose a simpler / less magic API, I'm interested. The goal of this library is to make working with websockets in asyncio as straightforward as possible.
Using the examples in the docs, how are you meant to stop the loop without spitting out pending task errors? Interrupts always seem to originate from … Also, the third example on the intro page ends with … In the end I came up with the following, which works the same as the previous code but also stops gracefully with no pending task errors. However, I still don't like how complicated it is to merely close the websocket.
I think when people are looking for a "Python websocket client" they want something similar to the JavaScript client. That means simply registering callbacks and not having to use queues or anything like that.
Usually pending tasks being cancelled isn't a problem if you're shutting down the connection anyway. If they are, just catch the cancellation.
EDIT: oops, I mixed this up with what happens on the server side; it's a common question. On the client side, for your use case, I think you can do this: … or this: …
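The two client-side snippets aren't reproduced above. As a general illustration of the first point only (cancelled pending tasks are harmless if you clean them up before the loop exits), a hypothetical shutdown helper might look like this:

import asyncio

async def cancel_pending(pending):
    # Cancel leftover tasks from asyncio.wait() and wait for them to
    # actually finish, so asyncio doesn't log
    # "Task was destroyed but it is pending!" at exit.
    for task in pending:
        task.cancel()
    # return_exceptions=True turns the CancelledErrors into results
    # instead of letting them propagate here.
    await asyncio.gather(*pending, return_exceptions=True)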
I think you're writing more complicated code than what's actually needed because you aren't very familiar with asyncio yet and you end up fighting it instead of taking advantage of it. You should run three tasks in parallel:
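The original list of three tasks isn't reproduced above, so the split below is an assumption: one task receiving from the websocket, one sending to it, and one doing the application work, all run in parallel with asyncio.gather(). The queue-based do_work is a placeholder for real application logic:

import asyncio
import websockets

async def consumer(websocket, inbox):
    # Pass everything the server sends to the application.
    async for message in websocket:
        await inbox.put(message)

async def producer(websocket, outbox):
    # Send everything the application produces to the server.
    while True:
        await websocket.send(await outbox.get())

async def do_work(inbox, outbox):
    # Placeholder application logic: echo messages back.
    while True:
        message = await inbox.get()
        await outbox.put("echo: " + message)

async def client(uri):
    inbox, outbox = asyncio.Queue(), asyncio.Queue()
    async with websockets.connect(uri) as websocket:
        await asyncio.gather(
            consumer(websocket, inbox),
            producer(websocket, outbox),
            do_work(inbox, outbox),
        )

# asyncio.run(client('ws://localhost:8765'))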
If you prefer callback-based programming, you should stop using asyncio and websockets. The point of asyncio is to provide coroutine-based async I/O handling instead of callback-based. The point of websockets is to provide a coroutine-based API. So I'm afraid you ended up in the wrong place! Perhaps Twisted or Tornado would work better for you. As far as I'm concerned, I won't exchange coroutines for callbacks, exactly for the same reason I won't exchange functions for gotos. But I'm aware there's a learning curve (or, more precisely, an unlearning curve).
It doesn't matter in terms of what the code is going to do, but it does make a real mess of the logs. I found that registering signal handlers makes things behave a lot nicer. In the end I came up with the following:
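The final code isn't reproduced above. A minimal sketch of the signal-handler approach being described, using loop.add_signal_handler() to resolve a stop future on SIGINT/SIGTERM; the main(stop) coroutine is a stand-in for the real client logic:

import asyncio
import signal

async def main(stop):
    # Stand-in for the real client logic: run until asked to stop.
    print("running; press Ctrl-C to stop")
    await stop
    print("shutting down cleanly")

loop = asyncio.new_event_loop()
stop = loop.create_future()

def ask_to_stop():
    # Resolve the stop future instead of letting KeyboardInterrupt
    # tear the tasks down mid-flight (Unix only).
    if not stop.done():
        stop.set_result(None)

for sig in (signal.SIGINT, signal.SIGTERM):
    loop.add_signal_handler(sig, ask_to_stop)

loop.run_until_complete(main(stop))
loop.close()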
I've found that this code will cleanly shut down whether the shutdown is triggered by an interrupt or by an exception.
Thank you @kylemacfarlane for asking these questions here. And thank you @aaugustin for answering them so well. This comment section has saved me from certain disaster. I've been fighting with asyncio for many hours of every day of the last 3 weeks. This solves almost all of my problems.
As requested here: edf3a32#commitcomment-10455621