Why are PHP function calls *so* expensive?

Is the overhead for calling a user function really that big? Or rather, is it still that big now? Both PHP and computer hardware have advanced by leaps and bounds in the nearly 7 years since this question was originally asked.

I've written my own benchmarking script below which calls mt_rand() in a loop both directly and via a user-function call:

<?php
const LOOPS = 10000000;

function myFunc ($a, $b)
{
    return mt_rand ($a, $b);
}

// Call mt_rand() once up front so that any first-call initialisation cost doesn't skew the timings below
mt_rand (0, 1000000);

$start = microtime (true);
for ($x = LOOPS; $x > 0; $x--)
{
    mt_rand (0, 1000000);
}
echo "Inline calling mt_rand() took " . (microtime(true) - $start) . " second(s)\n";

$start = microtime (true);
for ($x = LOOPS; $x > 0; $x--)
{
    myFunc (0, 1000000);
}
echo "Calling a user function took " . (microtime(true) - $start) . " second(s)\n";

Results on PHP 7 on a 2016-vintage i5-based desktop (more specifically, an Intel® Core™ i5-6500 CPU @ 3.20GHz × 4) are as follows:

Inline calling mt_rand() took 3.5181620121002 second(s)
Calling a user function took 7.2354700565338 second(s)

The overhead of calling a user function appears to roughly double the runtime, yet it took 10 million iterations for the difference to become particularly noticeable; the gap works out to roughly 0.37 microseconds per call ((7.24 - 3.52) seconds over 10,000,000 calls). This means that in most cases the difference between inline code and a user function is likely to be negligible. You should only really worry about this kind of optimisation in the innermost loops of your program, and even then only if benchmarking demonstrates a clear performance problem there. Anything else is a micro-optimisation that adds complexity to the source code for little or no meaningful performance benefit.

If your PHP script is slow, it's almost certainly down to I/O or a poor choice of algorithm rather than function call overhead. Connecting to a database, making a cURL request, writing to a file or even just echoing to stdout are all orders of magnitude more expensive than calling a user function. If you don't believe me, have mt_rand() and myFunc() echo their output and see how much slower the script runs!
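As a rough illustration of that experiment (a minimal sketch, not part of the original benchmark, with a smaller iteration count because printing 10 million lines takes a while):

<?php
// The same user-function loop as before, but echoing every result so the
// run is dominated by I/O rather than by the function calls themselves.
const LOOPS = 1000000;

function myFunc ($a, $b)
{
    return mt_rand ($a, $b);
}

$start = microtime (true);
for ($x = LOOPS; $x > 0; $x--)
{
    echo myFunc (0, 1000000), "\n";
}
// Report the timing on stderr so it isn't buried in the echoed output
fwrite (STDERR, "Calling a user function and echoing took " . (microtime(true) - $start) . " second(s)\n");

Redirect stdout to /dev/null when running it so terminal rendering doesn't inflate the numbers even further; even then, the echoing loop should come out far slower than the pure function-call loop above.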

In most cases the best way to optimise a PHP script is to minimise the amount of I/O it has to do (only select what you need in DB queries rather than relying on PHP to filter out unwanted rows, for example), or to cache the results of I/O operations through something such as memcached, reducing the cost of repeated trips to files, databases, remote sites, etc.
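As a rough illustration of the caching approach (a minimal sketch, assuming the Memcached extension and a local memcached server on the default port; getUser() and the loadUserFromDb() helper are hypothetical names, not anything from the original answer):

<?php
// Serve repeated reads from memcached; only hit the database on a cache miss.
$cache = new Memcached ();
$cache->addServer ('127.0.0.1', 11211);

function getUser (Memcached $cache, int $id)
{
    $key  = "user:$id";
    $user = $cache->get ($key);

    if ($user === false) {              // cache miss (or a stored false - fine for this sketch)
        $user = loadUserFromDb ($id);   // hypothetical expensive DB query
        $cache->set ($key, $user, 300); // keep it for 5 minutes
    }

    return $user;
}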


Function calls are expensive in PHP because there's a lot of work being done behind the scenes.

Note that isset is not a function (it has its own dedicated opcode), so it's faster.

For a simple program like this:

<?php
func("arg1", "arg2");

There are six (four + one for each argument) opcodes:

1      INIT_FCALL_BY_NAME                                       'func', 'func'
2      EXT_FCALL_BEGIN                                          
3      SEND_VAL                                                 'arg1'
4      SEND_VAL                                                 'arg2'
5      DO_FCALL_BY_NAME                              2          
6      EXT_FCALL_END                                            

You can check the implementations of these opcodes in zend_vm_def.h. Prepend ZEND_ to the names and search, e.g. for ZEND_INIT_FCALL_BY_NAME.
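If you want to produce a dump like the one above yourself, one option (assuming the VLD extension from PECL is installed; the exact output format differs between PHP versions, and test.php here is just a placeholder name) is:

php -d vld.active=1 -d vld.execute=0 test.php

This prints the compiled opcodes without actually executing the script.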

ZEND_DO_FCALL_BY_NAME is particularly complicated. Then there's the implementation of the function itself, which must unwind the stack, check the types, convert the zvals, possibly separate them, and only then do the actual work...


Function calls are expensive for the reasons perfectly explained by @Artefacto above. Note that their cost is also directly tied to the number of parameters/arguments involved. This is one area that I've paid close attention to while developing my own application framework: when it makes sense and it's possible to avoid a function call, I do.
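To see the per-argument effect in isolation, here's a quick sketch along the lines of the benchmark in the first answer (illustrative only; absolute numbers will vary by PHP version and hardware):

<?php
// Each extra argument adds a SEND_VAL/SEND_VAR opcode to every call,
// so a five-argument call does measurably more work than a zero-argument one.
const LOOPS = 10000000;

function noArgs () { return 1; }
function fiveArgs ($a, $b, $c, $d, $e) { return 1; }

$start = microtime (true);
for ($x = LOOPS; $x > 0; $x--) { noArgs (); }
echo "0 arguments: " . (microtime(true) - $start) . " second(s)\n";

$start = microtime (true);
for ($x = LOOPS; $x > 0; $x--) { fiveArgs (1, 2, 3, 4, 5); }
echo "5 arguments: " . (microtime(true) - $start) . " second(s)\n";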

One such example is a recent replacement of is_numeric() and is_integer() calls with a simple boolean test in my code, especially when several calls to these functions may be made. While some may think that such optimizations are meaningless, I've noticed a dramatic improvement in the responsiveness of my websites through this type of optimization work.

The following quick test will be TRUE for a number and FALSE for anything else.

if ($x == '0'.$x) { ... }

Much faster than is_numeric() and is_integer(). Again, it's perfectly valid to use optimizations like this, but only where it makes sense.
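If you want to measure the difference yourself, a quick sketch in the same spirit as the benchmark in the first answer (illustrative only; the string test and is_numeric() don't accept exactly the same set of inputs, so check the behaviour against your data before swapping one for the other):

<?php
// Rough comparison of is_numeric() against the string-concatenation test.
const LOOPS = 10000000;
$x = 12345;

$start = microtime (true);
for ($i = LOOPS; $i > 0; $i--) { $r = is_numeric ($x); }
echo "is_numeric(): " . (microtime(true) - $start) . " second(s)\n";

$start = microtime (true);
for ($i = LOOPS; $i > 0; $i--) { $r = ($x == '0' . $x); }
echo "string test:  " . (microtime(true) - $start) . " second(s)\n";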


I would contend that they are not. You're not actually testing a function call at all. You're testing the difference between a low-level out-of-bounds check (isset) and walking through a string to count the number of bytes (strlen).

I can't find any info specific to PHP, but strlen is usually implemented something like (including function call overhead):

$sp += 128;                 // set up a stack frame (function call overhead)
$str->address = 345;        // pass the argument
$i = 0;
while ($str[$i] != 0) {     // walk the string until the terminating byte
    $i++;
}
return $i < $length;        // compare the counted length against $length

An out-of-bounds check would typically be implemented something like:

return $str->length < $length;   // a single comparison against the stored length

The first one is looping. The second one is a simple test.
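For context, the comparison being discussed is roughly between these two ways of checking a string's length (a sketch, not the exact code from the original question):

<?php
$foo = "abcd";

// Function call: returns the length, which is then compared.
if (strlen ($foo) < 5) { /* ... */ }

// Special opcode: simply checks whether offset 4 exists, which is
// equivalent to asking whether the string is shorter than 5 characters.
if (!isset ($foo[4])) { /* ... */ }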
