From:             [EMAIL PROTECTED]
Operating system: RH Linux 2.4.9-31
PHP version:      4.1.1
PHP Bug Type:     Output Control
Bug description:  Large output data causes output buffering to crash

This script causes PHP to crash almost every time: it either segfaults and
returns nothing, or makes wget report "HTTP request sent, awaiting
response... End of file while parsing headers. Retrying." The include.txt
file I used was larger than 100 KB and contained plain text; its exact
content does not matter at all, I tried several versions.

<?php

// Return the current time as a float (seconds + microseconds).
function getMicrotime()
  {
    list($usec, $sec) = explode(" ", microtime());

    return ((float)$usec + (float)$sec);
  }

// Output-buffering callback: appends the elapsed time to the buffer.
function timer($buffer)
  {
    global $startTime;

    $endTime = getMicrotime();

    $diff = sprintf("%.5f", $endTime - $startTime);

    return $buffer . "\nExecution time: $diff sec<br>\n";
  }

$startTime = getMicrotime();
ob_start("timer");

// Read the file (up to 1 MB per pass) and echo it 500 times,
// building up a very large output buffer.
for ($i = 0; $i < 500; $i++)
  {
    $fh = fopen("include.txt", "r");
    $data = fread($fh, 1048576);
    fclose($fh);

    echo "$i $data\n";
  }

?>
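For what it is worth, a possible workaround — assuming the trigger is the
single multi-megabyte buffer handed to the callback at once — is to pass a
chunk size as the second argument to ob_start(), so the callback fires every
4096 bytes instead of once at the end. This is an untested sketch, not a
confirmed fix; the passthru_cb name and the str_repeat() stand-in for
include.txt are mine, for illustration only:

```php
<?php

// Pass-through callback; with a chunk size set below, it runs once
// per 4096-byte chunk rather than once on the whole buffer.
function passthru_cb($buffer)
  {
    return $buffer;
  }

// Second argument: flush the buffer through the callback every 4096 bytes.
ob_start("passthru_cb", 4096);

// Emit ~500 KB of output in place of the include.txt contents.
for ($i = 0; $i < 500; $i++)
  {
    echo str_repeat("x", 1024);
  }

ob_end_flush();

?>
```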


I compiled PHP with

./configure --prefix=/usr/local/php --disable-short-tags \
  --enable-safe-mode --enable-ftp --with-mysql=/usr/local/mysql \
  --with-zlib --enable-memory-limit

-- 
Edit bug report at http://bugs.php.net/?id=16077&edit=1
-- 
Fixed in CVS:        http://bugs.php.net/fix.php?id=16077&r=fixedcvs
Fixed in release:    http://bugs.php.net/fix.php?id=16077&r=alreadyfixed
Need backtrace:      http://bugs.php.net/fix.php?id=16077&r=needtrace
Try newer version:   http://bugs.php.net/fix.php?id=16077&r=oldversion
Not developer issue: http://bugs.php.net/fix.php?id=16077&r=support
Expected behavior:   http://bugs.php.net/fix.php?id=16077&r=notwrong
Not enough info:     http://bugs.php.net/fix.php?id=16077&r=notenoughinfo
Submitted twice:     http://bugs.php.net/fix.php?id=16077&r=submittedtwice