Run PHP Task Asynchronously

Another way to fork processes is via cURL. You can set up your internal tasks as a web service. For example:

  • http://domain/tasks/t1
  • http://domain/tasks/t2

Then, in your user-accessed scripts, make calls to the service:

$service->addTask('t1', $data); // post data to URL via curl

Your service can keep track of the task queue with MySQL or whatever you like; the point is that it's all wrapped up within the service, and your script is just consuming URLs. This frees you up to move the service to another machine/server if necessary (i.e., it's easily scalable).
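As a rough illustration, addTask() might be little more than a cURL POST; the class name, base URL, and timeout here are all assumptions:

class TaskService
{
    private $baseUrl;

    public function __construct($baseUrl)
    {
        $this->baseUrl = $baseUrl; // e.g. 'http://domain/tasks'
    }

    public function addTask($name, $data)
    {
        // POST the task data to the service and return quickly;
        // queuing and execution happen inside the service
        $ch = curl_init($this->baseUrl . '/' . $name);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 2); // keep the calling page fast
        $response = curl_exec($ch);
        curl_close($ch);
        return $response;
    }
}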

Adding HTTP authorization or a custom authorization scheme (like Amazon's web services) lets you open up your tasks to be consumed by other people/services (if you want), and you could take it further and add a monitoring service on top to keep track of queue and task status, for example:

  • http://domain/queue?task=t1
  • http://domain/queue?task=t2
  • http://domain/queue/t1/100931
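A status endpoint can be a thin lookup; this sketch assumes a MySQL table task_runs with task, id, and status columns:

// queue.php - e.g. http://domain/queue?task=t1&run=100931
$pdo = new PDO('mysql:host=localhost;dbname=tasks', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id, status FROM task_runs WHERE task = ? AND id = ?');
$stmt->execute([$_GET['task'], $_GET['run']]);
header('Content-Type: application/json');
echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));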

It does take a bit of set-up work but there are a lot of benefits.


When you just want to execute one or several HTTP requests without having to wait for the response, there is a simple PHP solution, as well.

In the calling script:

$socketcon = fsockopen($host, 80, $errno, $errstr, 10);
if ($socketcon) {
   // fire off the request and close the socket without reading the response
   $socketdata = "GET $remote_path/script.php?parameters=... HTTP/1.1\r\nHost: $host\r\nConnection: Close\r\n\r\n";
   fwrite($socketcon, $socketdata);
   fclose($socketcon);
}
// repeat this with different parameters as often as you like

In the called script.php, you can invoke these PHP functions in its first lines:

ignore_user_abort(true);
set_time_limit(0);

This causes the script to continue running, with no time limit, even after the HTTP connection has been closed.


I've used the queuing approach, and it works well: you can defer the processing until your server load is low, which lets you manage load quite effectively if you can easily partition off tasks that aren't urgent.

Rolling your own isn't too tricky (a sketch follows the list below); here are a few other options to check out:

  • GearMan - this answer was written in 2009, and since then it has become a popular option; see the comments below.
  • ActiveMQ if you want a full-blown open source message queue.
  • ZeroMQ - this is a pretty cool socket library that makes it easy to write distributed code without having to worry too much about the socket programming itself. You could use it for message queuing on a single host: have your web app push something onto a queue that a continuously running console app consumes at the next suitable opportunity.
  • beanstalkd - I only found this one while writing this answer, but it looks interesting.
  • dropr is a PHP-based message queue project, but it hasn't been actively maintained since Sep 2010.
  • php-enqueue is a more recently (2017) maintained wrapper around a variety of queue systems.
  • Finally, a blog post about using memcached for message queuing
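To give a feel for rolling your own, here is a minimal database-backed sketch; the jobs table, its columns, and handle_job() are all assumptions, and it only holds up for a single worker (multiple workers would need locking):

// producer (in the web app): push a job onto the queue
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')")
    ->execute([json_encode(['task' => 'send_report', 'user' => 42])]);

// consumer (a continuously running console script): pop and process jobs
while (true) {
    $job = $pdo->query("SELECT * FROM jobs WHERE status = 'pending' ORDER BY id LIMIT 1")
               ->fetch(PDO::FETCH_ASSOC);
    if (!$job) { sleep(5); continue; } // idle until work arrives
    $pdo->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$job['id']]);
    handle_job(json_decode($job['payload'], true)); // hypothetical task dispatcher
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}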

Another, perhaps simpler, approach is to use ignore_user_abort: once you've sent the page to the user, you can do your final processing without fear of premature termination, though this does have the effect of appearing to prolong the page load from the user's perspective.
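A rough sketch of that pattern: buffering the output lets you declare a Content-Length up front, which persuades most browsers to stop the loading indicator before the deferred work starts (do_deferred_work() is a hypothetical placeholder):

ignore_user_abort(true);
set_time_limit(0);

ob_start();
echo 'Thanks - your request has been received.'; // the normal page output
header('Content-Length: ' . ob_get_length());    // declare where the response ends
header('Connection: close');
ob_end_flush();
flush(); // the browser now has the complete response

do_deferred_work(); // hypothetical placeholder for the final processing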


If it's just a question of running expensive tasks, and php-fpm is available, why not use the fastcgi_finish_request() function?

This function flushes all response data to the client and finishes the request. This allows for time-consuming tasks to be performed without leaving the connection to the client open.

You don't really get asynchronicity this way; you just reorder the work:

  1. Run all your main code first.
  2. Call fastcgi_finish_request().
  3. Do all the heavy work.

Once again, php-fpm is required.
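Under php-fpm, the three steps above might look like this; process_heavy_task() is a placeholder for your own code:

// 1. main code: build and send the response
echo 'Your request has been accepted.';

// 2. flush everything to the client and close the connection (php-fpm only)
fastcgi_finish_request();

// 3. heavy work: the client is no longer waiting on any of this
ignore_user_abort(true);
set_time_limit(0);
process_heavy_task(); // hypothetical placeholder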