I have a database of just under 200K objects, with plans for it to grow larger. The objects are created by another object whose methods read files and build the objects from the data in those files. While creating the objects, I find I have to manually clear the cache, or Zope eventually crashes with an out-of-memory error. The server has 4 GB of RAM. The same issue arises when I catalog the objects. I am trying to maintain as much speed as possible, so I do not want to catalog until the end of the import.
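For reference, here is roughly the pattern I mean; this is a minimal sketch, not my actual import code. The `load_object` and `flush_cache` callables stand in for my file-reading factory and for whatever the right cache-clearing calls turn out to be (I believe `transaction.commit()` followed by `connection.cacheMinimize()` in ZODB, but that is exactly what I am asking about):

```python
# Sketch of a batched import loop that periodically flushes the
# object cache to keep memory bounded. load_object and flush_cache
# are placeholders for the real file-reading factory and the real
# ZODB calls (e.g. transaction.commit() + connection.cacheMinimize()).

def batched_import(paths, load_object, flush_cache, batch_size=1000):
    """Create one object per path, flushing every batch_size objects."""
    count = 0
    for path in paths:
        load_object(path)
        count += 1
        if count % batch_size == 0:
            flush_cache()  # commit + minimize cache in real code
    flush_cache()          # final flush for the last partial batch
    return count
```

The idea is that memory use is bounded by one batch of objects rather than the whole import, at the cost of an extra commit per batch.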

Is there a way to programmatically clear the cache during a large import or indexing run to avoid out-of-memory errors? I am using ZEO storages, and setting the cache on those to a smaller size has no real effect here, because the cache seems to outgrow its configured size when it is very active.
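For context, this is roughly how the caches are configured; a sketch of the relevant zope.conf fragment with made-up numbers. As I understand it, `cache-size` directly under `<zodb_db>` limits the in-memory pickle cache (counted in objects), while `cache-size` inside `<zeoclient>` is the on-disk ZEO client cache (counted in bytes); it is the in-memory one that seems to balloon:

```
<zodb_db main>
    mount-point /
    # in-memory pickle cache, per connection, in objects
    cache-size 5000
    <zeoclient>
      server localhost:8100
      storage 1
      # on-disk ZEO client cache, in bytes
      cache-size 20MB
    </zeoclient>
</zodb_db>
```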

Is Zope even a good fit for this? Should I consider another database? Cataloging seems to be an issue either way, whether the objects are stored in the ZODB or elsewhere.

-Kevin

_______________________________________________
Zope maillist  -  Zope@zope.org
http://mail.zope.org/mailman/listinfo/zope
**   No cross posts or HTML encoding!  **
(Related lists - http://mail.zope.org/mailman/listinfo/zope-announce
http://mail.zope.org/mailman/listinfo/zope-dev )