Edit report at https://bugs.php.net/bug.php?id=64518&edit=1
ID: 64518
User updated by: php at richardneill dot org
Reported by: php at richardneill dot org
Summary: optimisation for "for ($i=0; $i<count($array);$i++)"
when $array is global
Status: Not a bug
Type: Bug
Package: Performance problem
PHP Version: 5.4.13
Block user comment: N
Private report: N
New Comment:
I see your point about these optimisations being too low-level for the manual,
but some of them make a huge difference in performance. Try running the
following script:
http://www.richardneill.org/tmp/php-perf.txt
I notice three things:
1) Updating a variable in-place is 1000x slower than creating a new one.
Fast: $n = count($data);
Slow: $data = count($data);
2) Using $GLOBALS['var'] inside a function is about as fast as using the same
variable in the main scope. Using "global $var" is always at least a bit slower.
3) Using "global $var" just to read a variable inside a function is 100x slower
than reading the same variable via $GLOBALS['var'].
Fast: $n = count($GLOBALS['data']);
Slow: global $data; $n = count($data);
I think that (3) might be a bug.
I also think that these differences are far from negligible, and so PHP's
documentation *should* mention them.
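For reference, here is a minimal self-contained version of the comparison in
point (3). The function names and array size are my own choices, and the
timings it prints are illustrative only, not the figures quoted above.

<?php
// Build a reasonably large global array to count.
$data = range(0, 99999);

function read_via_global() {
    global $data;                   // by-reference fetch of the global
    return count($data);            // $data is then passed by value to count()
}

function read_via_globals_array() {
    return count($GLOBALS['data']); // read straight out of $GLOBALS
}

foreach (array('read_via_global', 'read_via_globals_array') as $fn) {
    $t1 = microtime(true);
    $fn();
    $t2 = microtime(true);
    printf("%-25s %.1f us\n", $fn, ($t2 - $t1) * 1E6);
}
?>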
Previous Comments:
------------------------------------------------------------------------
[2013-03-26 15:57:08] [email protected]
It's not always faster. Which one is faster is not very predictable without
good knowledge of the engine. We don't address such low-level details in the
manual.
------------------------------------------------------------------------
[2013-03-26 15:15:38] php at richardneill dot org
> The issue you are seeing is that `global $data` is a by-reference fetch
> (unlike `$data = $GLOBALS['data']`, which is by-value). The by-ref pass
> prevents us from doing a copy-on-write optimization when passing the array to
> the by-value count() function, so the array has to be copied every single
> time.
Thanks for the explanation; that makes sense. However, it also implies that
using "$GLOBALS['var']" is always faster, and sometimes much faster, than
"global $var". If this is true, might I suggest that this is a documentation
issue that should be addressed?
------------------------------------------------------------------------
[2013-03-26 13:34:40] [email protected]
The issue you are seeing is that `global $data` is a by-reference fetch (unlike
`$data = $GLOBALS['data']`, which is by-value). The by-ref pass prevents us
from doing a copy-on-write optimization when passing the array to the by-value
count() function, so the array has to be copied every single time.
The specific count() case is something we could optimize by adding a read-only
passing mode for function arguments (that never separates zvals). As that is
something that's also useful for a few other things, it might well be worth
doing.
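For illustration, a minimal sketch of the same effect without globals at all
(my own example, assuming PHP 5.x reference-counting semantics; the array
size, loop count and timings are arbitrary):

<?php
// Once an array is bound into a reference set, passing it by value to
// count() separates (copies) it on every call, whereas a plain array is
// shared copy-on-write.
$plain = range(0, 99999);
$bound = range(0, 99999);
$alias = &$bound;                 // $bound is now reference-bound, much like
                                  // a variable imported with "global"

$t1 = microtime(true);
for ($i = 0; $i < 100; $i++) { count($plain); }   // copy-on-write: no copies
$t2 = microtime(true);
for ($i = 0; $i < 100; $i++) { count($bound); }   // separated on each call
$t3 = microtime(true);

printf("plain array:     %8.0f us\n", ($t2 - $t1) * 1E6);
printf("reference-bound: %8.0f us\n", ($t3 - $t2) * 1E6);
?>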
------------------------------------------------------------------------
[2013-03-26 06:34:44] php at richardneill dot org
... and getting closer to the source of the problem...
function a() {    // about 1400 us
    global $data;
    $n = $data;
}

function b() {    // much much quicker: about 18 us
    $n = $GLOBALS['data'];
}
Something is definitely wrong here that makes "global" so much slower than
$GLOBALS.
Also, what makes b() so much quicker than a() isn't just a read-only
optimisation; it still applies even when we want to copy back into the global
scope.
Consider:
function x() {
    $n = $GLOBALS['data'];
    $GLOBALS['data'] = count($n);
}
and x() is always about as fast as b().
A secondary problem: if x() is rewritten as:
function y() {
    $GLOBALS['data'] = count($GLOBALS['data']);
}
then it can be either fast or slow depending on whether other function calls
contain "global $data" or not.
------------------------------------------------------------------------
[2013-03-26 06:09:30] php at richardneill dot org
Here's a simpler test case. The problem is actually that accessing global
variables from functions is much slower than local variables (and I was seeing
this exacerbated by looping over them with count).
Note that passing the data directly as a function argument is much faster than
passing it as a global variable - this seems to me to be the root of the
problem. i.e.
function ($a, $b, $c, $large_array) {
    // do stuff - this is fast.
}

function ($a, $b, $c) {
    global $large_array;
    // do stuff - this is SLOW.
}
--------------------------------------------
#!/usr/bin/php -ddisplay_errors=E_ALL
<?php
$NUMBER = 100000; // adjust for your computer.
for ($i = 0; $i < $NUMBER; $i++) {
    $data[$i] = "$i";
}

echo "Copy in main()...\n";
$t1 = microtime(true);
$copy = $data;
$t2 = microtime(true);
$time = ($t2 - $t1) * 1E6;
echo "Finished in $time microseconds.\n\n"; # 5.0 us

function f1() {
    global $NUMBER;
    echo "Copy local variable in function...\n";
    for ($i = 0; $i < $NUMBER; $i++) {
        $data_loc[$i] = "$i";
    }
    $t1 = microtime(true);
    $copy = $data_loc;
    $t2 = microtime(true);
    $time = ($t2 - $t1) * 1E6;
    echo "Finished in $time microseconds.\n\n"; # 1.9 us
}

function f2() {
    global $data;
    echo "Copy global variable into function...\n";
    $t1 = microtime(true);
    $n = $data;
    $t2 = microtime(true);
    $time = ($t2 - $t1) * 1E6;
    echo "Finished in $time microseconds.\n\n"; # 9557 us
}

function f3($data) {
    echo "Pass variable into function...\n";
    $t1 = microtime(true);
    $n = $data;
    $t2 = microtime(true);
    $time = ($t2 - $t1) * 1E6;
    echo "Finished in $time microseconds.\n\n"; # 0.95 us
}

f1();
f2();
f3($data);

echo "Note that as \$NUMBER is changed by factors of 10, the first two times
don't change much, but the latter scales as O(\$NUMBER).\n";
?>
------------------------------------------------------------------------
The remainder of the comments for this report are too long. To view
the rest of the comments, please view the bug report online at
https://bugs.php.net/bug.php?id=64518
--
Edit this bug report at https://bugs.php.net/bug.php?id=64518&edit=1