On 4/01/2009, at 0:16, Aaron Fulton <[email protected]> wrote:

> Perl is very good at doing this sort of processing.  Given the  
> trouble you're having, I'd be inclined to look at writing the file I/O  
> and pattern-matching part in Perl, passing the result to STDOUT  
> (which can then be read by PHP), or to write the whole thing in  
> Perl.
>
> Aaron
>
>
> Michael wrote:
>>
>> I have a PHP CLI script here that reads files, usually thousands or  
>> tens of thousands in a batch, runs them through a preg_match and  
>> records the results to a MySQL database.
>>
>> 5,000 - 10,000 files works OK, but go beyond this and it gets  
>> slower and slower as it progresses through the job.
>>
>> Anyone know why?

Are you closing files after processing? Are you freeing memory when  
you're done with it?
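To make that concrete, here's a minimal sketch of the shape I mean. The path, the pattern, and the database step are placeholders, not Michael's actual code; the points to note are the explicit fclose() per file, the unset() each iteration, and (if you're on PHP 5.3+) gc_collect_cycles() in case the job builds up reference cycles.

```php
<?php
// Sketch only -- paths, pattern, and the DB step are made up, not
// taken from the script under discussion.

function processFile($path, $pattern)
{
    $fh = fopen($path, 'r');
    if ($fh === false) {
        return 0;                 // unreadable file: record zero matches
    }
    $hits = 0;
    while (($line = fgets($fh)) !== false) {
        if (preg_match($pattern, $line)) {
            $hits++;
        }
    }
    fclose($fh);                  // close explicitly, don't wait for GC
    return $hits;
}

$paths = glob('/path/to/batch/*.txt');    // placeholder location
if ($paths === false) {
    $paths = array();
}
$pattern = '/example/';                   // placeholder pattern

foreach ($paths as $path) {
    $hits = processFile($path, $pattern);
    // ... INSERT the result into MySQL here ...
    unset($hits);                 // free per-file data before the next file
    // gc_collect_cycles();       // PHP 5.3+ only: collect reference cycles
}
```

If memory still climbs after that, log memory_get_usage() every few hundred files to see whether it's the file handling or the MySQL side that's accumulating.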



>>
>>
>> Is PHP just not the best language for processing the contents of  
>> many text files?
>> Should I be using C/C++?
>>
>> Or is there something else I should be looking at?
>>
>> The system hardware is fine for the job - Core2 Duo 2GHz with 4GB  
>> of RAM.
>>
>> Michael
>>
>>
>>
>>
>

--~--~---------~--~----~------------~-------~--~----~
NZ PHP Users Group: http://groups.google.com/group/nzphpug
To post, send email to [email protected]
To unsubscribe, send email to
[email protected]
-~----------~----~----~----~------~----~------~--~---
