Http:pool with concurrency limit? #43063
Replies: 3 comments 5 replies
-
Hey! Did you manage to find anything related? I need to solve the same problem.
-
I have a similar solution, although mine has a while loop on the outside, creating new pools as we go. Sadly, Laravel's Http client doesn't support concurrency limits or generator functions.
-
I've solved it; code below.

```php
use Http;
use GuzzleHttp\Promise\Each;
use Illuminate\Http\Client\Pool;

Http::pool(function (Pool $pool) {
    return [
        Each::ofLimit(
            (function () use ($pool) {
                for ($i = 1; $i <= 10; $i++) {
                    yield $pool->async()
                        ->get('https://httpbin.org/delay/' . $i)
                        ->then(fn () => dump($i . ' done'));
                }
            })(),
            10 // this is the concurrency limit
        )
    ];
});
```

For anyone interested, it all boils down to the following Macroable mixin:

```php
use Closure;
use GuzzleHttp\Promise\Each;
use Illuminate\Http\Client\Pool;
use Illuminate\Http\Client\Factory;

class HttpMixin
{
    public function concurrent(): Closure
    {
        return function (
            int $concurrency,
            callable $requests,
            ?callable $onFulfilled = null,
            ?callable $onRejected = null
        ): void {
            /** @var Factory $this */
            Each::ofLimit(
                $requests(new Pool($this)), // the callback yields promises from the pool
                $concurrency,
                $onFulfilled,
                $onRejected
            )->wait();
        };
    }
}
```

Usage:

```php
use Http;
use Generator;
use Illuminate\Http\Client\Pool;

Http::mixin(new HttpMixin());

Http::concurrent(
    10,
    function (Pool $pool): Generator {
        for ($i = 1; $i <= 10; $i++) {
            yield $pool->async()
                ->get('https://httpbin.org/delay/' . $i)
                ->then(fn () => dump($i . ' done'));
        }
    }
);
```
-
Hey all,
Is there a way to set a concurrency limit when using Http::pool?
My issue is that I have several hundred requests to make, but I want to tread softly on the API, so I want to only have a certain number of requests in flight at one time.
I know that I can use Guzzle directly to do this using GuzzleHttp\Pool (as per https://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests) but I would prefer to stick with the Http::pool method.
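For reference, the plain-Guzzle version looks roughly like this (adapted from the concurrent-requests section of the Guzzle quickstart linked above; the endpoint and counts here are placeholders):

```php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Lazily yield requests so hundreds of them are never built up front.
$requests = function (int $total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', 'https://httpbin.org/delay/1');
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 10, // at most 10 requests in flight at once
    'fulfilled' => function ($response, $index) {
        // handle a successful response
    },
    'rejected' => function ($reason, $index) {
        // handle a failed request
    },
]);

// Start the transfers and block until every request has settled.
$pool->promise()->wait();
```

Guzzle's pool launches a replacement request as soon as one completes, which is exactly the behaviour I'm after.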
My current best solution is just to load up the pool with requests up to my concurrency limit, then wait() for them to finish like this:
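The shape of it is something like this sketch ($urls and $limit are placeholders, and the batch handling is simplified):

```php
use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

$limit = 10;
$urls = [/* several hundred URLs */];

while (count($urls) > 0) {
    // Take the next batch, up to the concurrency limit.
    $batch = array_splice($urls, 0, $limit);

    // Queue up one batch of async requests on a fresh pool.
    $pool = new Pool(Http::getFacadeRoot());
    $promises = [];
    foreach ($batch as $url) {
        $promises[] = $pool->async()->get($url);
    }

    // Waiting on any one promise drains the whole queue,
    // so this blocks until the entire batch has finished.
    $promises[0]->wait();
}
```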
In the while loop I am just arbitrarily waiting on the first request. It does not seem to matter which one I wait for, as any call to wait() will actually resolve all pending promises in the pool, so the very first wait() will wait for all requests to complete, not just the first one.
This has the effect of spooling up the maximum number of concurrent requests, waiting for them all to complete, and repeating. The solution keeps me below my concurrency limit, but it is not very efficient.
Ideally I want to launch a new request when one request completes, therefore keeping at the concurrency limit until all calls are complete.
Does anyone have any ideas or pointers on how I might make this work?
Cheers,
Sab,