PHP can be a bit leaky in my experience.

In similar tasks, I've had a 'monitor' PHP script spawn a child
process that works through a couple of thousand files and then exits. The
monitor script then launches another process, and so on.

Doing it this way, you can also have the monitor script launch several
processes at a time, which can speed things up considerably...
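A minimal sketch of that monitor pattern, assuming a separate worker.php that does the actual file processing (the worker name, glob pattern, batch size and process count below are all my own illustration, not anything from Michael's setup):

```php
<?php
// Monitor sketch: split the file list into fixed-size batches and hand
// each batch to a short-lived worker process. Whatever memory a worker
// leaks is reclaimed by the OS when it exits.
// 'worker.php', the path, and the sizes are illustrative assumptions.

$files = glob('/path/to/files/*.txt');
if (!$files) {
    $files = array();
}
$batchSize = 2000;  // files each worker handles before it dies
$maxProcs  = 2;     // workers to keep running in parallel

$batches = array_chunk($files, $batchSize);
$running = array();

while ($batches || $running) {
    // Top the pool up to $maxProcs workers.
    while ($batches && count($running) < $maxProcs) {
        $list = tempnam(sys_get_temp_dir(), 'batch');
        file_put_contents($list, implode("\n", array_shift($batches)));
        // worker.php would read the list, run preg_match on each file,
        // record results in MySQL, then exit.
        $running[] = proc_open('php worker.php ' . escapeshellarg($list),
                               array(), $pipes);
    }
    // Reap workers that have finished.
    foreach ($running as $i => $proc) {
        $status = proc_get_status($proc);
        if (!$status['running']) {
            proc_close($proc);
            unset($running[$i]);
        }
    }
    usleep(100000); // poll every 0.1s instead of busy-waiting
}
```

The key point is that each worker only ever sees $batchSize files, so even if PHP's memory use creeps up per file, it never gets far enough to hurt.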


YMMV.

Matt



On Fri, Jan 2, 2009 at 11:25 AM, Michael <[email protected]> wrote:
>
> I have a PHP CLI script here that reads files, usually thousands or tens of
> thousands in a batch, runs them through a preg_match and records results to a
> MySQL database.
>
> 5,000 - 10,000 files works OK, but go further than this and it gets slower and
> slower as it progresses through the job.
>
> Anyone know why?
>
> Is PHP not really the best language for multi text file content processing?
> Should I be using C/C++?
>
> Or is there something else I should be looking at?
>
> The system hardware is fine for the job - Core2 Duo 2GHz with 4Gb of RAM.
>
> Michael
>
>

--~--~---------~--~----~------------~-------~--~----~
NZ PHP Users Group: http://groups.google.com/group/nzphpug
To post, send email to [email protected]
To unsubscribe, send email to
[email protected]
-~----------~----~----~----~------~----~------~--~---
