Hi,
here's a quick fix for the bug40236 test, which fails when run from inside a
directory whose path contains spaces (e.g. C:\Documents and Settings\ or
/home/Timm Friebe/):
Index: Zend/tests/bug40236.phpt
===================================================================
RCS file:
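For context, failures of this kind usually come from an unquoted path being interpolated into a shell command. A minimal PHP sketch of the usual remedy (this is an illustration, not the actual patch; the path and file names are made up):

```php
<?php
// Sketch: a path containing spaces must be quoted before it is
// embedded in a shell command, e.g. with escapeshellarg().
$dir = '/home/Timm Friebe';                      // example path with a space
$unsafe = "php $dir/test.php";                   // the space splits the argument
$safe   = 'php ' . escapeshellarg("$dir/test.php");
echo $safe, "\n";                                // the whole path stays one argument
```

On non-Windows systems escapeshellarg() wraps the string in single quotes, so the shell sees the path as a single argument.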
On Sat, 2009-01-10 at 17:37 +0100, Timm Friebe wrote:
> Hi,
> here's a quick fix for the bug40236 test failing when run from inside a
> directory including spaces (e.g. C:\Documents and Settings\ or /home/Timm
> Friebe/):
> Index: Zend/tests/bug40236.phpt
Hi,
I wrote some daemon scripts for a web crawler in PHP, and have now spent a
day working around the growing memory consumption of these scripts.
Since I use a MySQL connection, Syslog and many classes, I wanted to let
the script run a while before restarting. So the scripts had lines like:
$i =
First, this is not an internals question. But I suspect not many
people on the general list could help you.
> I wrote some daemon scripts for a web crawler in PHP. Now I spent one
> day to work around the growing memory consumption of these scripts.
I have PHP daemons running for weeks that
> Since I use a MySQL connection, Syslog and many classes, I wanted to let
> the script run a while before restarting. So the scripts had lines like:
> $i = 1000;
> while( --$i )
> {
>     while( run() ); // run returns false if there are no pending jobs
>     gc_collect_cycles();
>     // echo
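For readers following along, the restart pattern under discussion can be sketched like this. Note that run(), the queue, and the 64 MB limit are illustrative assumptions, not code from the thread, and gc_collect_cycles() requires PHP >= 5.3:

```php
<?php
// Sketch of a leak-mitigating worker loop: drain the job queue, collect
// garbage, and exit once a memory cap is exceeded so a supervisor
// (cron, init script, etc.) can restart the process with a fresh heap.
$queue = range(1, 5);               // stand-in for the crawler's pending jobs

function run(&$queue) {
    // Stand-in for the job function: returns false once no jobs remain.
    return array_shift($queue) !== null;
}

$i = 1000;                          // restart after at most 1000 idle cycles
while (--$i) {
    while (run($queue)) {
        // ... process one job ...
    }
    gc_collect_cycles();            // reclaim unreachable reference cycles
    if (memory_get_usage(true) > 64 * 1024 * 1024) {
        break;                      // over 64 MB: exit and let the supervisor restart us
    }
}
echo "worker exiting for restart\n";
```

The point of the pattern is that PHP's cycle collector only reclaims unreachable cycles; memory held by live caches, persistent connections, or extension internals is not returned, so periodic restarts remain the pragmatic backstop.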
Hello,
On Sun, 2009-01-11 at 01:45 +0100, Olivier Bonvalet wrote:
> Hello,
> this bug was corrected in CVS on 17 Oct, but I can't find the related
> patch on http://news.php.net/php.cvs/start/53560
> Does someone know which one it is? I would like to try to backport it to
> PHP 5.2.6.
> Thanks in advance
Thanks, Felipe!
Felipe Pena wrote:
Hi,
albeit *not* as a daemon, we've successfully developed a crawler in PHP
within our company. It can run for hours without a leak; if I remember
correctly, its peak memory consumption is below 64MB. However, we're
crawling only a small number of URLs, around 10,000.
As Brian mentioned: