I'm experiencing an unfashionable amount of anticipation for hearing
tomorrow what performance problems will be introduced by adding this
extra component into the loop for 50,000 iterations.
I mean 50,000,000 iterations. Approximately.
What's your memory_limit set to in php.ini? It looks like 768MB from your
error message.
You might want to set it to -1, which is no memory limit, and see what happens.
Try phpinfo() as well.
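To illustrate the suggestion above: a minimal sketch of checking and overriding memory_limit at runtime instead of editing php.ini (the printed current limit will vary by setup):

```php
<?php
// Inspect the current limit; on this poster's box it would show "768M".
echo ini_get('memory_limit'), "\n";

// Override it for this script only; -1 removes the limit entirely.
ini_set('memory_limit', '-1');
echo ini_get('memory_limit'), "\n";   // prints -1
```

Raising the limit per-script keeps the global php.ini setting conservative for everything else on the server.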
Matt
On Wed, Jan 7, 2009 at 1:10 PM, Michael mich...@networkstuff.co.nz wrote:
Now that I have the other
which is line 124
On Wed, Jan 7, 2009 at 1:26 PM, Michael mich...@networkstuff.co.nz wrote:
On Wed, 07 Jan 2009 13:21:51 Matthew White wrote:
What's your memory_limit set to in php.ini? It looks like 768MB from your
error message.
You might want to set it to -1, which is no memory limit, and
You need to parse large text files in chunks. I think someone on your
previous thread outlined a promising strategy for this.
--~--~-~--~~~---~--~~
NZ PHP Users Group: http://groups.google.com/group/nzphpug
To post, send email to nzphpug@googlegroups.com
$fh = fopen('data.txt', 'r');
while (($line = fgets($fh)) !== false) {
    preg_match(..., $line); ...
}
fclose($fh);
As far as I know this chunks the text file on a per-line basis. If not,
fgets($fh, 4096) caps each read at 4096 bytes (the length argument belongs to fgets, not fopen).
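To demonstrate the length-capped read mentioned above: fgets($fh, $length) returns at most $length - 1 bytes, or up to the first newline, whichever comes first. A minimal sketch (php://memory stands in for the real data file):

```php
<?php
// Write a 10,000-byte line into an in-memory stream as test data.
$fh = fopen('php://memory', 'r+');
fwrite($fh, str_repeat('x', 10000) . "\n");
rewind($fh);

// A capped read: returns at most 4095 bytes even though the line is longer,
// so memory use per iteration stays bounded regardless of line length.
$chunk = fgets($fh, 4096);
echo strlen($chunk), "\n";   // prints 4095
fclose($fh);
```

This is what keeps a line-by-line loop safe even against files with pathologically long lines.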
On Jan 7, 1:10 pm, Michael mich...@networkstuff.co.nz wrote:
Now that I have the other
On Wed, 07 Jan 2009 15:41:44 Steve Boyd wrote:
$fh = fopen('data.txt', 'r');
while (($line = fgets($fh)) !== false) {
    preg_match(..., $line); ...
}
fclose($fh);
The issue with this code is that while it reads the whole file, the output
array from preg_match_all is empty. (Note that preg_match also returns
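Regarding the empty preg_match_all output: a common pitfall when scanning line by line is that $matches is overwritten on every iteration (so inspecting it after the loop shows only the last line's result), or that the pattern is expected to span lines it never sees. A minimal sketch of accumulating results across lines; the pattern and data here are placeholders, not the poster's actual ones:

```php
<?php
// Stand-in for fopen('data.txt', 'r'): an in-memory stream with two lines.
$fh = fopen('php://memory', 'r+');
fwrite($fh, "id=1\nid=2\n");
rewind($fh);

$all = [];
while (($line = fgets($fh)) !== false) {
    // preg_match_all fills $matches via its third argument and returns
    // the number of matches found on this line (0 when nothing matches).
    if (preg_match_all('/id=(\d+)/', $line, $matches)) {
        // Merge each line's captures; without this, $matches would only
        // ever hold the most recent line's result.
        $all = array_merge($all, $matches[1]);
    }
}
fclose($fh);

echo implode(',', $all), "\n";   // prints 1,2
```

If the real pattern needs to match across line boundaries, per-line scanning cannot work and the file must be read in larger overlapping chunks instead.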