I wrote a simple script to move data from a ZODB database to sqlite.
My data is in an OOBTree; I chose it because supposedly you can bring the
buckets into memory one at a time....

So what I am doing is basically this: I iterate over my objects and write
them one by one to the other db.
for k, u in user_root['userdb'].items():  # I have tried iteritems() here
                                          # too, but the results are the same
    # write the data to sqlite
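The copy loop above can be sketched end to end with the stdlib sqlite3 module. Here a plain dict stands in for the OOBTree, and the table name, columns, and batch size are all illustrative assumptions, not part of the original code. With a real ZODB connection, the usual way to keep the loop's memory flat is to prune the connection's object cache at the same commit points, e.g. with connection.cacheMinimize().

```python
import sqlite3

# Hypothetical stand-in for user_root['userdb']: a plain dict.  With a real
# OOBTree the loop body is identical; the extra step (an assumption, not in
# the original post) is shrinking the ZODB object cache every BATCH rows so
# unghosted objects from earlier iterations do not accumulate.
userdb = {"user%d" % i: {"name": "Name %d" % i, "email": "u%d@example.com" % i}
          for i in range(10)}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (key TEXT PRIMARY KEY, name TEXT, email TEXT)")

BATCH = 4  # commit (and, with ZODB, minimize the cache) every BATCH rows
for n, (k, u) in enumerate(userdb.items(), start=1):
    conn.execute("INSERT INTO users VALUES (?, ?, ?)",
                 (k, u["name"], u["email"]))
    if n % BATCH == 0:
        conn.commit()
        # with a real ZODB connection, also: zodb_connection.cacheMinimize()
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # → 10
```

Committing in batches also keeps the sqlite transaction log small; the same checkpoints are a natural place to release whatever the ZODB side has cached.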

My problem is that my database is big, and memory consumption grows as the
loop progresses until all my memory is exhausted and the OS starts swapping.
It seems that the objects from previous iterations are not being cleared
from memory.
Am I doing something wrong, or is it impossible to iterate over a ZODB
database that is bigger than your memory?


Flávio Codeço Coelho
"My grandfather once told me that there were two kinds of people: those who
do the work and those who take the credit. He told me to try to be in the
first group; there was much less competition."
Indira Gandhi
registered Linux user # 386432
get counted at http://counter.li.org
ZODB-Dev mailing list  -  ZODB-Dev@zope.org
