I have a PHP CLI script here that reads files, usually thousands or tens of
thousands in a batch, runs each one through a preg_match and records the
results to a MySQL database.
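
The loop is essentially this shape (simplified, with a placeholder regex and
placeholder table/column names; I'm showing PDO here just for illustration):

<?php
// Simplified version of the batch loop (placeholder pattern,
// table and column names).
$pdo  = new PDO('mysql:host=localhost;dbname=results', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO matches (filename, hit) VALUES (?, ?)');

foreach (glob('/path/to/batch/*.txt') as $file) {
    $text = file_get_contents($file);            // read the whole file into memory
    if (preg_match('/some pattern/', $text, $m)) {
        $stmt->execute(array($file, $m[0]));     // record the match
    }
    unset($text);                                // free the contents before the next file
}
?>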

5,000 - 10,000 files works OK, but go much further than that and it gets
slower and slower as it progresses through the job.

Anyone know why?

Is PHP just not the best language for processing the contents of lots of
text files? Should I be using C/C++?

Or is there something else I should be looking at?

The system hardware is fine for the job - a Core 2 Duo at 2 GHz with 4 GB of RAM.

Michael
