Hello Anna,

Tuesday, May 11, 2004, 1:18:48 PM, you wrote:

A> What techniques are there to optimise the use of very large arrays?  I have
A> an array of over 380K keys and values that is being processed incrementally,
A> however the longer it runs the more time it seems to be taking - an array of
A> 100k items is taking 1 hour to process, an array of 400k items is taking 9
A> hours to process.

A> I'm chunking up the array into smaller arrays, but that doesn't seem to be
A> making much difference.

Err, stick more memory and a faster CPU in your server!

If you absolutely must have an array that large (and I'd love to know
why you do) then it's going to have to be held in memory by PHP for
the entire duration of the script. 380,000 items, even at a few KB each,
is a hell of a lot of overhead. I'm impressed PHP managed it at all, let
alone in 9 hours :)
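If the array really must be processed in one run, the chunking idea is sound - the trick is to free each chunk as soon as it's done so the working set stays small. A minimal sketch, assuming a current PHP and a hypothetical $process callback standing in for whatever per-item work the real script does:

```php
<?php
// Hypothetical sketch: walk a large key => value array in fixed-size
// chunks, releasing each chunk after processing so memory stays flat.
// $process is a stand-in for the real per-item work.
function process_in_chunks(array $data, int $chunkSize, callable $process): int
{
    $done = 0;
    // array_chunk() with preserve_keys = true keeps the original keys.
    foreach (array_chunk($data, $chunkSize, true) as $chunk) {
        foreach ($chunk as $key => $value) {
            $process($key, $value);
            $done++;
        }
        unset($chunk); // let PHP reclaim the chunk before the next one
    }
    return $done;
}
```

Whether this helps depends on where the time actually goes - if the per-item work itself scans the whole array (e.g. in_array() inside the loop), the runtime will grow with the square of the item count no matter how it's chunked.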

-- 
Best regards,
 Richard Davey
 http://www.phpcommunity.org/wiki/296.html

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php