php-windows Digest 1 Jun 2007 11:39:15 -0000 Issue 3248
Topics (messages 28030 through 28033):
Problems with working with large text files
28030 by: Adam Niedzwiedzki
28031 by: Stut
28032 by: Adam Niedzwiedzki
PHP and batch-files
28033 by: Gustav Wiberg
Administrivia:
To subscribe to the digest, e-mail:
[EMAIL PROTECTED]
To unsubscribe from the digest, e-mail:
[EMAIL PROTECTED]
To post to the list, e-mail:
[EMAIL PROTECTED]
----------------------------------------------------------------------
--- Begin Message ---
Hi all,
I have a simple PHP script that I'm running from the command line; it opens an
HTTP web log, processes it, then zips it when done.
If the HTTP log is under 200MB (approx) this all hums along nicely; as soon
as the file is up over 300MB PHP falls over.
Fatal error: Out of memory (allocated 378535936) (tried to allocate
381131220 bytes)
I'm running php5.2.2 on Windows 2003 64Bit Enterprise.
I have my php.ini memory_limit set to -1 and in my scripts I set the
following
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend
parsing request data
memory_limit = -1 ; Maximum amount of memory a script may
consume (128MB)
I have this in inline code..
ini_set("memory_limit",-1);
set_time_limit(0);
It seems to fall over on either fopen() or on gzcompress() or both if the
file is over 300MB.
Anyone know of another option to tell PHP to just be unlimited in its RAM
usage?
The machine it's running on is an 8GB machine with over 3GB free (it's a
quad Opteron box).
Anyone have any clues to help me out :(
Cheers
Ad
--- End Message ---
--- Begin Message ---
Adam Niedzwiedzki wrote:
I have a simple PHP script that I'm running from the command line; it opens an
HTTP web log, processes it, then zips it when done.
If the HTTP log is under 200MB (approx) this all hums along nicely; as soon
as the file is up over 300MB PHP falls over.
Fatal error: Out of memory (allocated 378535936) (tried to allocate
381131220 bytes)
I'm running php5.2.2 on Windows 2003 64Bit Enterprise.
I have my php.ini memory_limit set to -1 and in my scripts I set the
following
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend
parsing request data
memory_limit = -1 ; Maximum amount of memory a script may
consume (128MB)
I have this in inline code..
ini_set("memory_limit",-1);
set_time_limit(0);
It seems to fall over on either fopen() or on gzcompress() or both if the
file is over 300MB.
Anyone know of another option to tell PHP to just be unlimited in its RAM
usage?
The machine it's running on is an 8GB machine with over 3GB free (it's a
quad Opteron box).
Anyone have any clues to help me out :(
Yeah, don't load the whole frickin' log into memory at the same time.
Refactor your code so it can process the log line by line and you'll
save yourself many many headaches in the future. I've never come across
a good reason to load a large file into memory all at once just to "process" it.
-Stut
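A minimal sketch of that line-by-line approach (the file name and the per-line work are illustrative, and a small sample log is written first just to keep the snippet self-contained):

```php
<?php
// Write a tiny sample log so the sketch runs on its own.
$logfile = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($logfile, "GET /a\nGET /b\nGET /c\n");

// Process the log one line at a time: only a single line is ever in memory.
$hits = 0;
$fp = fopen($logfile, 'r');
while (($line = fgets($fp)) !== false) {
    // ...parse/transform $line here instead of slurping the whole file...
    $hits++;
}
fclose($fp);
unlink($logfile);

echo "$hits lines processed\n"; // prints "3 lines processed"
```

Because fgets() pulls one line per call, peak memory stays at roughly the length of the longest line, regardless of whether the log is 200MB or 2GB.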
--- End Message ---
--- Begin Message ---
Hi Stut,
(Yeah ok... but that still doesn't explain why PHP ain't letting me. I'm
thinking BUG :P)
Anyways this is how I'm handling the file...
if ($fp = fopen($logfile, 'r')) {
    debug_log("$logfile has " . count(file($logfile)) . " lines to process");
    while (!feof($fp)) {
        $line = fgets($fp);
You just made me realise I'm calling file() for the line count (that's a big
hit). Any other way of doing it?
And YES, I want a line count BEFORE I start looping through it...
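One way to get that count without file() is to read fixed-size chunks and count newlines, so memory use stays at one buffer no matter how big the log is (a sketch; the sample file is generated only to make it runnable):

```php
<?php
// Generate a sample file so the sketch is self-contained.
$path = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($path, str_repeat("a log line\n", 1000));

// Count newlines chunk by chunk instead of loading the file with file().
$count = 0;
$fp = fopen($path, 'rb');
while (!feof($fp)) {
    $chunk = fread($fp, 8192);              // 8 KB buffer, never the whole file
    $count += substr_count($chunk, "\n");
}
fclose($fp);
unlink($path);

echo $count, "\n"; // prints 1000
```

This makes one extra pass over the file, but it's sequential I/O with a constant-size buffer, which is far cheaper than file() building a 300MB array of strings.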
But I can't read line by line to do the gzcompress; I have to load the whole
file up to compress it, don't I?
gzcompress ($data, 9)
$data being the string (the whole file) of text I need to compress.
Cheers
Ad
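gzcompress() does need the whole string, but the zlib extension also provides gzopen()/gzwrite(), which compress a stream chunk by chunk, so the 300MB file never has to fit in memory at once. A sketch, assuming the zlib extension is loaded (file names are illustrative):

```php
<?php
// Build a sample source file so the sketch is self-contained.
$src = tempnam(sys_get_temp_dir(), 'log');
file_put_contents($src, str_repeat("some log line\n", 1000));
$origSize = filesize($src);
$dst = $src . '.gz';

// Stream-compress: read 8 KB at a time and feed it to the gzip writer.
$in  = fopen($src, 'rb');
$out = gzopen($dst, 'wb9');                 // level 9, like gzcompress($data, 9)
while (!feof($in)) {
    gzwrite($out, fread($in, 8192));
}
fclose($in);
gzclose($out);

// Round-trip check: decompress and compare with the original size.
$back = implode('', gzfile($dst));
echo strlen($back) === $origSize ? "ok\n" : "mismatch\n"; // prints "ok"
unlink($src);
unlink($dst);
```

Note the output is a gzip file (like gzip on the command line) rather than the raw zlib string gzcompress() returns, which is usually what you want for archived logs anyway.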
-----Original Message-----
From: Stut [mailto:[EMAIL PROTECTED]
Sent: Friday, 1 June 2007 12:30 PM
To: Adam Niedzwiedzki
Cc: [EMAIL PROTECTED]
Subject: Re: [PHP-WIN] Problems with working with large text files
Adam Niedzwiedzki wrote:
> I have a simple PHP script that I'm running from the command line; it
> opens an HTTP web log, processes it, then zips it when done.
> If the HTTP log is under 200MB (approx) this all hums along nicely;
> as soon as the file is up over 300MB PHP falls over.
>
> Fatal error: Out of memory (allocated 378535936) (tried to allocate
> 381131220 bytes) I'm running php5.2.2 on Windows 2003 64Bit
> Enterprise.
> I have my php.ini memory_limit set to -1 and in my scripts I set the
> following
>
> ;;;;;;;;;;;;;;;;;;;
> ; Resource Limits ;
> ;;;;;;;;;;;;;;;;;;;
>
> max_execution_time = 30 ; Maximum execution time of each script, in
seconds
> max_input_time = 60 ; Maximum amount of time each script may spend
> parsing request data
> memory_limit = -1 ; Maximum amount of memory a script may
> consume (128MB)
>
> I have this in inline code..
>
> ini_set("memory_limit",-1);
> set_time_limit(0);
>
> It seems to fall over on either fopen() or on gzcompress() or both if
> the file is over 300MB.
> Anyone know of another option to tell PHP to just be unlimited in its
> RAM usage?
> The machine it's running on is an 8GB machine with over 3GB free.
> (it's a quad Opteron box).
>
> Anyone have any clues to help me out :(
Yeah, don't load the whole frickin' log into memory at the same time.
Refactor your code so it can process the log line by line and you'll save
yourself many many headaches in the future. I've never come across a good
reason to load a large file into memory all at once just to "process" it.
-Stut
--- End Message ---
--- Begin Message ---
Hi there!
I want to use PHP to run as a "batch file" on the server every night. What are
the alternatives for doing this?
I'm using IIS 6 and Win2003 Server, and PHP as an ISAPI module (version 5.2.2).
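One common approach on Windows 2003 is to bypass the ISAPI module for this and have Task Scheduler run the command-line php.exe each night; schtasks is the built-in way to register such a job. The paths and task name below are illustrative assumptions, not actual install locations:

```bat
:: Register a nightly task (02:00) that runs a PHP script with the CLI
:: binary -- adjust both paths to your own installation.
schtasks /create /tn "NightlyPhpJob" ^
    /tr "C:\php\php.exe C:\scripts\nightly.php" ^
    /sc daily /st 02:00:00
```

A plain .bat file invoking php.exe, scheduled the same way, works just as well and keeps the web server out of the picture entirely.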
Best regards
Gustav Wiberg
--- End Message ---