On Sun, Jan 24, 2010 at 3:11 AM, D. Dante Lorenso <da...@lorenso.com> wrote:
> I'm loading millions of records into a backend PHP cli script that I
> need to build a hash index from to optimize key lookups for data that
> I'm importing into a MySQL database. The problem is that storing this
> data in a PHP array is not very memory efficient and my millions of
> records are consuming about 4-6 GB of ram.
What are you storing? An array of full row objects?
If so, storing only the row ids will reduce the memory usage.
Loading full row objects keeps every column of every record in memory,
but loading just the integer row id values will significantly decrease
the memory footprint.
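A minimal sketch of the difference (the data shape and key names here are made up for illustration):

```php
<?php
// Compare the memory cost of keeping full rows vs. just integer ids.
// The record layout (id/name/email) is an assumption for this sketch.

$before   = memory_get_usage();
$fullRows = [];
for ($i = 0; $i < 100000; $i++) {
    // Full row: one associative array per record.
    $fullRows["key$i"] = [
        'id'    => $i,
        'name'  => "name$i",
        'email' => "user$i@example.com",
    ];
}
$fullCost = memory_get_usage() - $before;
unset($fullRows);

$before  = memory_get_usage();
$idsOnly = [];
for ($i = 0; $i < 100000; $i++) {
    // Ids only: a single integer per record.
    $idsOnly["key$i"] = $i;
}
$idCost = memory_get_usage() - $before;

printf("full rows: %.1f MB, ids only: %.1f MB\n",
    $fullCost / 1048576, $idCost / 1048576);
```

Run it with the PHP CLI and you'll see the id-only index is a fraction of the size; fetch the full row from MySQL only when a lookup actually hits.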
Besides, you can load row ids chunk by chunk. If you have
10 million rows to process, load 10,000 rows as a chunk, process
them, then load the next chunk. That way memory stays bounded by the
chunk size instead of the full data set.
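One way to sketch that chunked loop (the `users` table name and integer primary key are assumptions; I've used keyset pagination, i.e. `WHERE id > ?`, rather than LIMIT/OFFSET, since OFFSET gets slow on large tables):

```php
<?php
// Sketch: process row ids in fixed-size chunks via keyset pagination.
// Each chunk's memory is released before the next chunk is fetched.
function processInChunks(PDO $pdo, int $chunkSize, callable $processRow): int {
    $lastId = 0;
    $total  = 0;
    while (true) {
        // Page forward from the last id seen; cheaper than LIMIT/OFFSET.
        $stmt = $pdo->prepare(
            'SELECT id FROM users WHERE id > ? ORDER BY id LIMIT ' . $chunkSize
        );
        $stmt->execute([$lastId]);
        $ids = $stmt->fetchAll(PDO::FETCH_COLUMN);
        if (!$ids) {
            break;                      // no rows left
        }
        foreach ($ids as $id) {
            $processRow((int) $id);     // your per-row work goes here
            $total++;
        }
        $lastId = (int) end($ids);      // resume after the last processed id
    }
    return $total;
}
```

With a 10,000-row chunk size you touch all 10 million rows while never holding more than one chunk of ids in memory at a time.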
A good algorithm can solve your problem anytime. ;-)
My talks, http://talk.cmyweb.net
Follow me, http://twitter.com/shiplu
SUST Programmers, http://groups.google.com/group/p2psust
Innovation distinguishes bet ... ... (ask Steve Jobs the rest)
PHP General Mailing List (http://www.php.net/)