How to make asynchronous HTTP requests in PHP

Is there a way in PHP to make asynchronous HTTP calls? I don't care about the response, I just want to do something like file_get_contents(), but not wait for the request to finish before executing the rest of my code. This would be super useful for setting off "events" of a sort in my application, or triggering long processes.

Any ideas?


The answer I'd previously accepted didn't work. It still waited for responses. This does work though, taken from How do I make an asynchronous GET request in PHP?

function post_without_wait($url, $params)
{
    $post_params = array();
    foreach ($params as $key => $val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key.'='.urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) return;

    $out = "POST ".(isset($parts['path']) ? $parts['path'] : '/')." HTTP/1.1\r\n";
    $out.= "Host: ".$parts['host']."\r\n";
    $out.= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out.= "Content-Length: ".strlen($post_string)."\r\n";
    $out.= "Connection: Close\r\n\r\n";
    $out.= $post_string;

    // Send the request and close the socket without reading the response
    fwrite($fp, $out);
    fclose($fp);
}
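
Usage is a single fire-and-forget call; the URL and parameters below are placeholders:

post_without_wait('http://example.com/longtask.php', array(
    'task' => 'send_newsletter',  // hypothetical parameters
    'id'   => 123,
));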

If you control the target that you want to call asynchronously (e.g. your own "longtask.php"), you can close the connection from that end, and both scripts will run in parallel. It works like this:

  1. quick.php opens longtask.php via cURL (no magic here)
  2. longtask.php closes the connection and continues (magic!)
  3. cURL returns to quick.php when the connection is closed
  4. Both tasks continue in parallel

I have tried this, and it works just fine. But quick.php won't know anything about how longtask.php is doing, unless you create some means of communication between the processes.
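
For reference, here's a minimal sketch of the quick.php side (the URL is a placeholder; any blocking HTTP client works, cURL shown here):

// quick.php - returns as soon as longtask.php closes the connection
$ch = curl_init('http://example.com/longtask.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);  // blocks only until longtask.php sends "Connection: close"
curl_close($ch);

echo "longtask.php keeps running in the background\n";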

Try this code in longtask.php, before you do anything else. It will close the connection, but still continue to run (and suppress any output):

while (ob_get_level()) ob_end_clean();  // discard any open output buffers
header('Connection: close');
ignore_user_abort(true);                // keep running after the client disconnects
ob_start();
echo 'Connection Closed';
$size = ob_get_length();
header("Content-Length: $size");        // tell the client exactly how much to read
ob_end_flush();
flush();                                // push the response out; the client is done

The code is copied from the PHP manual's user contributed notes and somewhat improved.
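
After the flush, longtask.php can carry on with the real work. A sketch of what might follow (do_long_running_work() is a hypothetical stand-in for your actual job):

// longtask.php, continued: the client already has its response
set_time_limit(0);        // let the long task run past max_execution_time
do_long_running_work();   // hypothetical placeholder for the actual job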


You can do trickery by using exec() to invoke something that can do HTTP requests, like wget, but you must direct all output from the program somewhere, such as a file or /dev/null; otherwise the PHP process will wait for that output.

If you want to separate the process from the Apache process entirely, try something like the following (I'm not sure about this, but I hope you get the idea):

exec('bash -c "wget -O /dev/null (url goes here) > /dev/null 2>&1 &"');
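
The same idea as a self-contained sketch, escaping the URL so it can't break the shell command (assumes wget is installed; the URL is a placeholder):

$url = 'http://example.com/longtask.php';  // placeholder
// Redirecting output and backgrounding the process lets exec() return immediately
exec('wget -O /dev/null ' . escapeshellarg($url) . ' > /dev/null 2>&1 &');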

It's not a nice business, and you'll probably want something like a cron job invoking a heartbeat script that polls an actual database event queue to do real asynchronous events.


As of 2018, Guzzle has become the de facto standard library for HTTP requests, used in several modern frameworks. It's written in pure PHP and does not require installing any custom extensions.

It can do asynchronous HTTP calls very nicely, and it can even pool them, e.g. when you need to make 100 HTTP calls but don't want to run more than 5 at a time (see the Pool sketch after the example below).

Concurrent request example

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client(['base_uri' => 'http://httpbin.org/']);

// Initiate each request but do not block
$promises = [
    'image' => $client->getAsync('/image'),
    'png'   => $client->getAsync('/image/png'),
    'jpeg'  => $client->getAsync('/image/jpeg'),
    'webp'  => $client->getAsync('/image/webp')
];

// Wait on all of the requests to complete. Throws a ConnectException
// if any of the requests fail
$results = Promise\unwrap($promises);

// Wait for the requests to complete, even if some of them fail
$results = Promise\settle($promises)->wait();

// You can access each result using the key of the corresponding promise;
// settle() wraps each outcome in an array with 'state' and 'value' keys.
echo $results['image']['value']->getHeader('Content-Length')[0];
echo $results['png']['value']->getHeader('Content-Length')[0];

See http://docs.guzzlephp.org/en/stable/quickstart.html#concurrent-requests
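
For the 100-calls-but-at-most-5-at-a-time case mentioned above, Guzzle's Pool helper handles the throttling. A minimal sketch (the httpbin.org URLs are just placeholders):

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Generate the 100 requests lazily instead of building them all up front
$requests = function ($total) {
    for ($i = 0; $i < $total; $i++) {
        yield new Request('GET', "http://httpbin.org/get?i={$i}");
    }
};

$pool = new Pool($client, $requests(100), [
    'concurrency' => 5,  // at most 5 requests in flight at any time
    'fulfilled'   => function ($response, $index) {
        // handle each successful response
    },
    'rejected'    => function ($reason, $index) {
        // handle each failed request
    },
]);

// Start the transfers and block until the whole pool has completed
$pool->promise()->wait();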


You can use this library: https://github.com/stil/curl-easy

It's pretty straightforward then:

<?php
$request = new cURL\Request('http://yahoo.com/');
$request->getOptions()->set(CURLOPT_RETURNTRANSFER, true);

// Specify function to be called when your request is complete
$request->addListener('complete', function (cURL\Event $event) {
    $response = $event->response;
    $httpCode = $response->getInfo(CURLINFO_HTTP_CODE);
    $html = $response->getContent();
    echo "\nDone.\n";
});

// The loop below runs for as long as the request is being processed
$timeStart = microtime(true);
while ($request->socketPerform()) {
    printf("Running time: %dms    \r", (microtime(true) - $timeStart)*1000);
    // Here you can do anything else, while your request is in progress
}

Below you can see the console output of the above example. It displays a simple live clock indicating how long the request has been running:


[animation of the console output]