Edit report at http://bugs.php.net/bug.php?id=52252&edit=1

 ID:               52252
 Updated by:       ras...@php.net
 Reported by:      wimroffel at planet dot nl
 Summary:          PHP seems to run out of memory
 Status:           Feedback
 Type:             Bug
 Package:          Output Control
 Operating System: Windows XP
 PHP Version:      5.3.2

 New Comment:

A better way to do what you are trying to do is to run each
file through a separate PHP process so a fatal error won't
terminate your checking script.  Like this:

foreach (glob("./*.php") as $f) {
  $out = null;
  // escapeshellarg() guards against filenames with spaces or shell metacharacters
  exec("php -l " . escapeshellarg($f), $out, $ret);
  if ($ret) {
    echo "Problem in $f\n" . implode("\n", $out) . "\n";
  }
}


Previous Comments:
------------------------------------------------------------------------
[2010-07-05 18:12:14] ras...@php.net

Are you actually getting an out of memory error?

What I think is happening is that one of the files you
include has a syntax error in it, which will immediately
abort your script.

Remove the ob_start() and run your script and watch for
errors.
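
A minimal sketch of this diagnosis (the temp file name is hypothetical;
it assumes the `php` CLI binary is on your PATH): linting a deliberately
broken file in a child process reports the parse error and keeps going,
whereas include()ing the same file would fatally abort the caller.

```php
<?php
// Create a throwaway file containing deliberately broken PHP.
$tmp = tempnam(sys_get_temp_dir(), 'chk');
file_put_contents($tmp, '<?php if ('); // guaranteed parse error

// Lint it in a child process; a parse error sets a nonzero exit code
// but cannot kill this script, unlike include($tmp).
exec('php -l ' . escapeshellarg($tmp) . ' 2>&1', $out, $ret);
if ($ret) {
    echo "Problem in $tmp\n" . implode("\n", $out) . "\n";
} else {
    echo "$tmp is OK\n";
}
unlink($tmp);
```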

------------------------------------------------------------------------
[2010-07-05 16:13:48] wimroffel at planet dot nl

Description:
------------
For my site I generate a large number of PHP pages from a PHP script.
Experience shows that this process is not totally reliable and that some
of the pages will contain errors. The errors may be small, but unlike
HTML, PHP is unforgiving: the output immediately collapses to a single
PHP error line.



So I made a script to check all my PHP files. I use the output buffering
functions to capture each file's output, then measure the size of that
output. If it is extremely small, it must be a PHP error line.



After reading some 30 to 120 files PHP just stops. For a given directory
the stopping point is always the same. The total file size read (all
files together) before the stop varied between 500 and 3500 KB.

Test script:
---------------
function checkfile($myfile)
{ if (!file_exists($myfile)) return;
  echo $myfile."<br>";
  ob_start();
  include($myfile);
  $data = ob_get_contents();
  ob_end_clean();
  if (strlen($data) < 500) echo "<h2>".$myfile." is too small!</h2>";
}

$dp = opendir("./");
while ($dirfile = readdir($dp))
{ if (($dirfile == ".") || ($dirfile == "..")) // all files are php
    continue;
  checkfile($dirfile);
}
closedir($dp);

Expected result:
----------------
I would expect the script to process all the files in the directory.

Actual result:
--------------
Instead it just processes a certain number and then stops. This is not a
timeout: it happens within a second.


------------------------------------------------------------------------


