I have a situation where I am trying to create an index of the words contained in a particular table. The table is about 9,400 rows, and the index ends up being about 1,500,000 words (rows). While creating the index, I select from the table and extract the words. I cache these words in an array and use that array with a prepared insert statement into the word index table.

My problem is memory. It maxes out at about 35 MB. This is a bit high, and what I would like to do is commit the insert transaction whenever the array reaches a certain size, say 10,000 entries, then unset the array and continue. The problem with that is I cannot commit the inserts while the fetch statement is still pending.

I have tried fetchAll instead, but it has similar memory issues.

I have also tried committing all of the inserts at the end, but that causes SQLite itself to hog the memory.

Is there any way to fix this, or is this just the cost of doing business with SQLite?



PHP Database Mailing List (http://www.php.net/)
