On Feb 12, 2007, at 12:25 PM, Andreas Jung wrote:

I have the following script to emulate a long-running ZEO client that
writes 100MB to a page template:

import transaction

pt = app.foo  # an existing Page Template
while 1:
    data = '*' * 100000000  # 100MB payload
    T = transaction.begin()
    pt.pt_edit(data, 'text/html')
    T.commit()
    print 'done'

This script fails badly during the first commit() call. Is this a bug
or a feature? I am using Zope 2.10.2 on Mac OS X Intel.

Based on the traceback you gave, this looks like a bug. I've noticed, however, that large database records can lead to memory errors at sizes much smaller than one would expect, presumably because pickling and transmitting a single huge record requires several transient copies of the data, so peak memory use is a multiple of the record size. If the problem is ultimately traced to a hidden memory error, there's not much that can be done. In the long run, I expect we'll advise that "large" objects be put in blobs, where "large" might be smaller than one might expect. For example, I've seen 90MB records lead to memory errors even on machines with hundreds of megabytes free.
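
For what it's worth, here's a minimal sketch of the blob approach, assuming a ZODB version with blob support (the exact storage setup varies by version, and the 'big_data' key is just for illustration):

import transaction
from ZODB import DB
from ZODB.blob import Blob
from ZODB.FileStorage import FileStorage

# Open a blob-enabled storage; blob_dir is a directory that holds
# blob files outside the main database file.
storage = FileStorage('Data.fs', blob_dir='blobs')
db = DB(storage)
conn = db.open()
root = conn.root()

blob = Blob()
f = blob.open('w')
# The payload is written to a file on disk; only a tiny Blob record
# is pickled into the database, so commit() never has to handle the
# 100MB as a single database record.
f.write('*' * 100000000)
f.close()
root['big_data'] = blob
transaction.commit()
print 'done'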

Jim

--
Jim Fulton            mailto:[EMAIL PROTECTED]    Python Powered!
CTO                   (540) 361-1714              http://www.python.org
Zope Corporation      http://www.zope.com         http://www.zope.org



