> Within the loop I'd focus on the elapsed time for the call to
> preg_match_all() - regular expression matching is notorious for being
> slow, and you might have a particularly slow pattern/data combination
> going on.
>
> The other potential slowdown will be in reading the data from disk.
> Determine how long each file read takes (again using microtime()) and
> if there are spikes there you could try moving the data to another
> disk, or into memory if you have the RAM.
>
> While you're at it you should also collect and report memory usage
> (delta for each iteration and total) using memory_get_usage().  What's
> your total memory usage when things start slowing down?

Thank you :-)

I am currently trialing file_get_contents().
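
Something along these lines (just a rough sketch - the path and pattern below are placeholders, not my real ones):

    $files   = glob('/path/to/batch/*.txt');   // placeholder path
    $pattern = '/placeholder/';                // placeholder pattern

    foreach ($files as $i => $file) {
        $t0 = microtime(true);
        $data = file_get_contents($file);
        $readTime = microtime(true) - $t0;

        $t1 = microtime(true);
        preg_match_all($pattern, $data, $matches);
        $regexTime = microtime(true) - $t1;

        // log both timings per file so spikes in either stand out
        printf("%d %s  read %.4fs  regex %.4fs\n",
               $i, basename($file), $readTime, $regexTime);
    }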

It is very unlikely to be the PCRE regex, as it runs fast enough on smaller 
datasets (which are otherwise similar in per-file size and style).

It also isn't the MIME type detection. I have ruled that out.

The problem IMHO is definitely in the file reading.

The reason is that if the large, problematic batches ran the whole way through 
at the same speed as their first 10-20%, I'd be happy with that for now and 
leave any further tweaks for another time (if ever)...
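
To answer your memory question and rule out memory pressure as the cause of the 
drop-off, the same loop can log the per-iteration delta you suggest - again just 
a sketch, with $prevMem as an illustrative name:

    $prevMem = memory_get_usage();

    foreach ($files as $file) {
        // ... file_get_contents() + preg_match_all() as above ...

        $mem = memory_get_usage();
        printf("memory delta %d bytes, total %d bytes, peak %d bytes\n",
               $mem - $prevMem, $mem, memory_get_peak_usage());
        $prevMem = $mem;
    }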

Kind regards,

Michael
