[PHP] memory leak - how to find it?

2006-07-31 Thread Robin Getz
I am trying to debug a php script that I downloaded, which has a memory 
leak in it.


I was looking for a way to find what variables were in php's memory, and 
what size they were, but I couldn't find anything.
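
Something along these lines is what I was hoping for (a minimal sketch of 
the idea, not working code: get_defined_vars() only sees the current 
scope, strlen(serialize()) is only a rough size estimate, and on PHP 4 
memory_get_usage() needs --enable-memory-limit):

<?php
// Rough snapshot of the variables in the current scope and their
// approximate sizes, plus the script's total memory use.
function dump_var_sizes($vars)
{
    foreach ($vars as $name => $value) {
        // serialize() gives a crude byte count for any variable type
        echo $name . ': ' . strlen(serialize($value)) . " bytes\n";
    }
    echo 'total: ' . memory_get_usage() . " bytes\n";
}

// called between iterations of the conversion loop:
// dump_var_sizes(get_defined_vars());
?>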


The script is an off-line wiki conversion tool (it walks through a wiki to 
create a bunch of html files for off-line viewing). As the tool walks the 
files and does the conversion, I can see the memory consumption go up and 
up, until it hits the memory limit and crashes.


Any suggestions appreciated.

Thanks
-Robin

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php



RE: [PHP] apache_child_terminate?

2005-01-04 Thread Robin Getz
Curt Zirzow wrote:
> > I should be able to turn this on with 'child_terminate' in php.ini
> >
> > However, I do this, and when I do a phpinfo(); it returns an
> > apache2handler section with only three directives:
> > - engine
> > - last_modified
> > - xbithack
>
> Are you running Apache multithreaded, perchance?
>
>   http://php.net/manual/en/function.apache-child-terminate.php
>
> Of course, this begs the question of why you want all PHP scripts to
> terminate Apache - this will only add extra CPU usage.
I have a download script that, for some reason, seems to consume lots of 
memory - no matter what I do.

Someone suggested calling apache_child_terminate() after the download is 
done, to kill the Apache process and make sure that the memory is released 
back to the OS.
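
In other words, something like this (a sketch of the suggestion only; 
apache_child_terminate() is not available under every SAPI, hence the 
function_exists() guard):

<?php
// stream the file, then ask Apache to retire this child so its
// memory goes back to the OS ($path is a hypothetical name)
readfile($path);
if (function_exists('apache_child_terminate')) {
    apache_child_terminate();
}
?>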

-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] handling large files w/readfile

2005-01-04 Thread Robin Getz
Jason Wong wrote:
> Are you using the above code on its own (ie not within some other code
> that may affect the memory usage)?

Well, here is the entire file (it is pretty short - only about two pages - 
but apologies in advance if anyone considers this bad form).

The site is called with something like:
http://blackfin.uclinux.org/frs/download.php/123/STAMP.jpg
Files are stored in:
$sys_upload_dir.$group_name.'/'.$filename
-- frs/download.php -
<?php
/**
 * GForge FRS Facility
 *
 * Copyright 1999-2001 (c) VA Linux Systems
 * The rest Copyright 2002-2004 (c) GForge Team
 * http://gforge.org/
 *
 * @version   $Id: download.php,v 1.6 2004/10/08 23:05:29 gsmet Exp $
 *
 * This file is part of GForge.
 *
 * GForge is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation; either version 2 of the License, or
 * (at your option) any later version.
 *
 * GForge is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with GForge; if not, write to the Free Software
 * Foundation, Inc., 59 Temple Place, Suite 330, Boston, 
MA  02111-1307  USA  */

$no_gz_buffer=true;
require_once('pre.php');
$arr=explode('/',$REQUEST_URI);
$file_id=$arr[3];
$res=db_query("SELECT frs_file.filename,frs_package.is_public,
frs_file.file_id,groups.unix_group_name,groups.group_id
FROM frs_package,frs_release,frs_file,groups
WHERE frs_release.release_id=frs_file.release_id
AND groups.group_id=frs_package.group_id
AND frs_release.package_id=frs_package.package_id
AND frs_file.file_id='$file_id'");
if (db_numrows($res) < 1) {
	Header("Status: 404");
	exit;
}
$is_public =db_result($res,0,'is_public'); 
$group_name=db_result($res,0,'unix_group_name');
$filename = db_result($res,0,'filename'); 
$release_id=db_result($res,0,'release_id');
$group_id = db_result($res,0,'group_id');

$Group = group_get_object($group_id);
if (!$Group || !is_object($Group) || $Group->isError()) {
exit_no_group();
}
if(!$Group->isPublic()) {
	session_require(array('group' => $group_id));
}
//  Members of projects can see all packages
//  Non-members can only see public packages
if(!$is_public) {
if (!session_loggedin() || (!user_ismember($group_id) &&
!user_ismember(1,'A'))) {
exit_permission_denied();
}
}

/*
echo $group_name.'|'.$filename.'|'.$sys_upload_dir.$group_name.'/'.$filename;
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
echo '<br />file exists';
passthru($sys_upload_dir.$group_name.'/'.$filename);
}
*/
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
Header('Content-disposition: filename="'.str_replace('"', '',
$filename).'"');
Header("Content-type: application/binary");
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
Header("Content-length: $length");

# Here is where all the problems start
readfile($sys_upload_dir.$group_name.'/'.$filename);
if (session_loggedin()) {
	$s = session_get_user();
	$us = $s->getID();
} else {
	$us = 100;
}
$res=db_query("INSERT INTO frs_dlstats_file
(ip_address,file_id,month,day,user_id)
VALUES
('$REMOTE_ADDR','$file_id','".date('Ym')."','".date('d')."','$us')");
} else {
Header("Status: 404");
}

?>
=
If this runs for a while, things go very bad. This seems to be related to 
a specific download manager called NetAnts (http://www.netants.com/), 
which seems to be popular in China and attempts to open the same URL for 
downloading 10-15 times at the same instant.

If I replace things with:
 snip =
if (file_exists($sys_upload_dir.$group_name.'/'.$filename)) {
	# if the file is too big to download (10 Meg) - use a different
	# method than php
	$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
	Header('Content-disposition: filename="'.str_replace('"', '',
		$filename).'"');
	Header("Content-type: application/binary");
	Header("Content-length: $length");

	$fp = fopen($sys_upload_dir.$group_name.'/'.$filename,'rb');
	$buff = 0;
	while (!feof($fp)) {
		$buff = fread($fp, 4096);
		print $buff;
	}
	unset($buff);
	fclose ($fp);
===  snip - rest is the same =
I get exactly the same problem - I come back and there are 2, 3, or 4 
apache processes, each consuming memory the size of the largest downloads.

The only way I can make things work with large downloads is to use this:
 snip 
if 

[PHP] apache_child_terminate?

2005-01-03 Thread Robin Getz
I am trying to turn on apache_child_terminate with PHP V4.3.10 / Apache 2.0.52
According to: 
http://php.planetmirror.com/manual/en/function.apache-child-terminate.php

I should be able to turn this on with 'child_terminate' in php.ini
However, when I do this and then run phpinfo(), it returns an 
apache2handler section with only three directives:
- engine
- last_modified
- xbithack
When I look at other setups with Apache 1.3, it lists child_terminate.
Does this function only work with Apache 1.3?
Thanks
-Robin  

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] handling large files w/readfile

2005-01-02 Thread Robin Getz
Rasmus Lerdorf wrote:
> > $buff = 0;
> > while (!feof($fp)) {
> >    $buff = fread($fp, 4096);
> >    print $buff;
> > }
> > unset($buff);
> > fclose ($fp);
>
> Well, the above code does not use more than 4K of RAM plus a bit of
> overhead. So if something is causing your processes to grow to 450M, you
> need to look elsewhere, because this code is definitely not the cause.
Well, the test case is:
1) the above with big files => big apache processes - machine crashes
2) download big files with:
	Header("Location: ".$html_pointer_to_file);
   => small apache processes - works great
So I don't know what else it could be - I am open to suggestions, or tests 
that anyone wants me to run.

-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] PHP V4.3.10 / Apache2 Handler Question...

2005-01-01 Thread Robin Getz
I am trying to turn on apache_child_terminate with PHP V4.3.10 / Apache 2.0.52
When I do a phpinfo(); it returns an apache2handler section with only 
three directives:
- engine
- last_modified
- xbithack
When I look at other setups with Apache 1.3, it lists child_terminate, 
which I need to be able to set.

Does this function only work with Apache 1.3?
Thanks
-Robin
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] handling large files w/readfile

2005-01-01 Thread Robin Getz
Curt Zirzow wrote:
* Thus wrote Richard Lynch:
 Sebastian wrote:
  i'm working on a app which output files with readfile() and some 
headers..
  i read a comment in the manual that says if your outputting a file
  php will use the same amount of memory as the size of the file. so,
  if the file is 100MB php will use 100MB of memory.. is this true?

 I don't know if it's STILL true (or ever was) that readfile() would
 suck the whole file into RAM before spitting it out...  Seems real
 unlikely, but...

Never was and still isn't.
using either readfile or fpassthru is the best route.
All I know is that I am hosting a GForge site, and if I leave the 
download.php code as is, I end up with apache processes that are 200+ Meg 
(the size of my download files).
http://gforge.org/plugins/scmcvs/cvsweb.php/gforge/www/frs/download.php?rev=1.6;content-type=text%2Fplain;cvsroot=cvsroot%2Fgforge

(which uses readfile)
I have tried fpassthru - same thing.
I have even tried:
$fp = fopen($sys_upload_dir.$group_name.'/'.$filename,'rb');
while (!feof($fp)) {
   $buff = fread($fp, 4096);
   print $buff;
}
fclose ($fp);
and I get the same thing. The only thing that seems to work is:
Header("Location: ".$html_pointer_to_fp);
which lets apache do the downloading.
I would do an apache_child_terminate(), but the function does not seem to 
be available to me (see my previous question about this).

Any thoughts or suggestions - I am open to trying them.
My next experiment is:

$buff = 0;
while (!feof($fp)) {
   $buff = fread($fp, 4096);
   print $buff;
}
unset($buff);
fclose ($fp);

Hopefully that will make sure that $buff is only created once, and that 
the memory is cleared after the loop is done.

-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] handling large files w/readfile

2005-01-01 Thread Robin Getz
Robin Getz wrote:
> My next experiment is:
>
> $buff = 0;
> while (!feof($fp)) {
>    $buff = fread($fp, 4096);
>    print $buff;
> }
> unset($buff);
> fclose ($fp);

Nope, that doesn't work either - I came back and saw apache processes that 
were 450+ Meg. I have changed it back to the Apache redirection for now.

If anyone has __any__ suggestions, I am more than happy to try. I would 
like to get this figured out.

Thanks
-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Apache 2.0.52 / PHP 4.3.10 Integration Question...

2004-12-31 Thread Robin Getz
Hi.
I am trying to get Apache 2.0.52 / PHP 4.3.10 working with some scripts I 
am using.

I have a file named /www/projects which is a php script.
When I type the URL www.site/projects/variable, I want 'variable' passed 
to the 'projects' script.
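
Inside the script, the trailing path component comes in via PATH_INFO, 
roughly like this (a sketch only - the real GForge redirector does more 
than this):

<?php
// /projects/variable  ->  PATH_INFO is "/variable"
$pathinfo = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '';
$variable = basename($pathinfo);   // "variable"
echo 'requested project: ' . htmlspecialchars($variable);
?>
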
I have the http.conf set up as:
<Files projects>
  SetInputFilter  PHP
  SetOutputFilter PHP
  AcceptPathInfo  On
</Files>
This used to work with Apache 2.0.40 and PHP 4.2.3 - but what happens now 
is that the PHP script is actually sent back to the browser as text.

Any thoughts? I poked around on Google, and saw this at 
http://dan.drydog.com/apache2php.html:

> However, SetOutputFilter / SetInputFilter no longer works for me. It used
> to work with an earlier PHP 4.x or Apache 2 version, but not with Apache
> 2.0.47/PHP 4.3.3. I understand this (PHP as an Apache 2 filter) is
> experimental, so I don't use it anymore.

I tried things like:
  AddType text/html   php
But I keep getting the same thing in my browser:

<?php
/**
 * Projects Redirector
 *
---snip---
?>

Any thoughts? Thanks in advance.
-Robin
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] Apache 2.0.52 / PHP 4.3.10 Integration Question...

2004-12-31 Thread Robin Getz
Andrew Kreps wrote:
> I had to add this line to my httpd.conf:
> AddType application/x-httpd-php .php

I have this, and the DirectoryIndex - the problem is that my script does 
not end in a .php extension (GForge).

If I rename the file projects.php and point to that, it works, but that 
means an entire rewrite of the existing GForge.

I guess the question is: how do I make a file that does not end in .php, 
or have any extension at all, be understood as a PHP file?

Thanks
-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


RE: [PHP] Apache 2.0.52 / PHP 4.3.10 Integration Question...

2004-12-31 Thread Robin Getz
The prize goes to Mark Charette of [EMAIL PROTECTED] for reading the 
apache manual in more detail than I did.

After reading
http://httpd.apache.org/docs-2.0/mod/core.html#forcetype
and then adding:
ForceType application/x-httpd-php
things work great.
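
That is, presumably the same <Files> block as before, with ForceType 
replacing the filter directives (a sketch of the working config):

<Files projects>
  ForceType application/x-httpd-php
</Files>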
Thanks
-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Hi.
I have searched a few of the mailing lists, and have not found an answer.
I am working on a site that is currently running gforge ( 
http://gforge.org/ ). The process that is used to download files from the 
file repository is something like:

Header('Content-disposition: filename="'.str_replace('"', '', $filename).'"');
Header("Content-type: application/binary");
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
Header("Content-length: $length");
readfile($sys_upload_dir.$group_name.'/'.$filename);
The issue is that readfile() writes the file to the output buffer before 
sending it to the client, which is a problem when several people try to 
download large files at the same time (the Ant download manager tries 
downloading things by opening 20 connections). 20 x a single 250 Meg file 
rips through physical memory and swap pretty fast and crashes my machine.

Any thoughts on how to turn output buffering off? I have tried, but have 
not been able to get it working properly.
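
For reference, this is the direction I have been trying (a sketch: clear 
any active output buffers, then stream the file in small chunks, flushing 
each one - I have not been able to confirm that it releases the memory):

<?php
// drop any output buffers that are active before streaming (sketch)
while (ob_get_level() > 0) {
	ob_end_clean();
}
$fp = fopen($sys_upload_dir.$group_name.'/'.$filename, 'rb');
while (!feof($fp)) {
	echo fread($fp, 8192);
	flush();	// push each chunk out to the client immediately
}
fclose($fp);
?>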


On a similar note, is there a portable way to determine available system 
memory (physical and swap)? Right now I am using something like:
=
# ensure there is enough free memory for the download
$free = shell_exec('free -b');
$i = 0;
while ( $i != strlen($free) ) {
	$i = strlen($free);
	$free = str_replace('  ',' ',$free);
}
$free = str_replace("\n",'',$free);
$freeArray = explode(' ',$free);
$total_free = $freeArray[9] + $freeArray[18];
==

Calling shell_exec isn't very portable to other systems.
Thanks in advance.
-Robin
 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Klaus Reimer [EMAIL PROTECTED] wrote:
> If this theory is true, you may try fpassthru().

I replaced:
  readfile($name);
with:
  $fp = fopen($name, 'rb');
  fpassthru($fp);
and now I don't lose 250 Meg of memory every time I download a 250 Meg 
file. If someone wants to add this to the readfile() PHP manual - great.

Thanks
Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


[PHP] Determining system resources

2004-11-04 Thread Robin Getz
I have been unable to find a php function to determine available system
memory (physical and swap).
Right now I am using something like:
=
# ensure there is enough free memory for the download
$free = shell_exec('free -b');
$i = 0;
while ( $i != strlen($free) ) {
	$i = strlen($free);
	$free = str_replace('  ',' ',$free);
}
$free = str_replace("\n",'',$free);
$freeArray = explode(' ',$free);
$total_free = $freeArray[9] + $freeArray[18];
==
Does anyone have any ideas that would work on all OSes, i.e. without 
shell_exec()?
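
The closest I have found without shell_exec() is reading /proc/meminfo 
directly, which is still Linux-only (a sketch; the field names assume a 
/proc/meminfo in the usual "MemFree: ... kB" format):

<?php
// Linux-only sketch: sum MemFree and SwapFree from /proc/meminfo.
// Not actually portable either - it just avoids shell_exec().
function free_memory_bytes()
{
	$total = 0;
	foreach (file('/proc/meminfo') as $line) {
		if (preg_match('/^(MemFree|SwapFree):\s+(\d+)\s+kB/', $line, $m)) {
			$total += $m[2] * 1024;
		}
	}
	return $total;
}
?>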

Thanks in advance.
-Robin
--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Determining system resources

2004-11-04 Thread Robin Getz
Francisco M. Marzoa Alonso [EMAIL PROTECTED] wrote:
> As you cannot lock the other processes on the system, this will not give
> you any guarantee that the resources will not change - and they probably
> will - while you're trying to download that file.

Yes, I understand, but not even knowing whether you are in the right order 
of magnitude is kind of scary, isn't it?

For example, on my system I have 1 Gig of physical memory and 3 Gig of 
swap. If I end up with only 512k free, something is very wrong, and I 
should disallow functions I know will eat up memory. There is a low 
probability that multiple connections will pass the test and then consume 
the memory, so yes, this is not 100% coverage.
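
The guard I have in mind is roughly this (a sketch; free_memory_bytes() is 
a hypothetical stand-in for whatever memory check ends up working):

<?php
// refuse the memory-hungry code path when free memory looks too low
$length = filesize($sys_upload_dir.$group_name.'/'.$filename);
if (free_memory_bytes() < $length) {	// hypothetical helper
	Header("Status: 503 Service Unavailable");
	exit;
}
readfile($sys_upload_dir.$group_name.'/'.$filename);
?>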

Maybe what I am seeing is actually a bug in the way that readfile() handles 
low memory situations. If there is not enough memory for internal functions 
to run, they should error, not crash your system.

I take it, from the way the conversation has moved, that there is no way 
to see what the free system memory is without calling a command-line 
utility (free or mem).

-Robin 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
Curt Zirzow [EMAIL PROTECTED] wrote:
> > replaced:
> >   readfile($name);
> > with:
> >   $fp = fopen($name, 'rb');
> >   fpassthru($fp);
>
> The only difference between readfile() and fpassthru() is what parameters
> you pass it.
>
> Something else is the problem, what version of php are you running?
I am using php 4.2.2.

OK - I lied. The same problem exists with fpassthru() (now that I have let 
it run a little longer). I now have 5 sleeping httpd processes on my 
system that are consuming 200 Meg each.

Any thoughts? 

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php


Re: [PHP] Downloading Large (100M+) Files

2004-11-04 Thread Robin Getz
OK, I checked things out, and based on some private emails and pointers 
from Francisco M. Marzoa [EMAIL PROTECTED], I have now

replaced:
   readfile($name);
with:
while (!feof($fp)) {
	$buf = fread($fp, 4096);
	echo $buf;
	$bytesSent += strlen($buf);	/* We know how many bytes were sent
					   to the user */
}

I restarted apache (to free all the memory), and we will see how it goes 
overnight.

-Robin
BTW - output buffering is turned OFF. ob_get_level() returns 0, but both 
readfile() and fpassthru() seem to allocate (and never release) memory the 
size of the file.
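
The next thing I will try is flushing after each chunk, in case something 
between PHP and Apache is buffering the whole response (a sketch):

<?php
while (!feof($fp)) {
	echo fread($fp, 4096);
	flush();	// hand each chunk to Apache instead of letting it pile up
}
fclose($fp);
?>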

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php