johnf wrote:
> On Tuesday 03 February 2009 08:19:00 am Paul McNett wrote:
>> johnf wrote:
>>> I have a tuple that contains 200,000+ records.  I was using appendDataSet
>>> to populate a bizobj and then save the bizobj.  However, with this many
>>> records the program never seems to finish appendDataSet().  I'm
>>> guessing that I may have reached a limit.  Is there a size limit to a
>>> Dabo cursor?  Is there a way around the issue?
>> There aren't any limits save that of memory. Does the disk start thrashing?
>> If you monitor the process do you stop seeing activity?
>>
>> Paul
> I'm running top, and CPU goes to 100% for the python process and just stays
> there. I've let it run for over an hour, easily.

Well, how long does it take to append 1,000 records? Is more than an hour out
of the realm of possibility? Can you apply something like the following diff,
which would tell us whether it is actually hung or just going slower than
expected?

{{{
Index: dCursorMixin.py
===================================================================
--- dCursorMixin.py     (revision 5017)
+++ dCursorMixin.py     (working copy)
@@ -1124,7 +1124,8 @@
                kf = self.KeyField
                if not isinstance(kf, tuple):
                        kf = (kf, )
-               for rec in ds:
+               for idx, rec in enumerate(ds):
+                       print idx
                        self.new()
                        for col, val in rec.items():
                                if self.AutoPopulatePK and (col in kf):

}}}
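The same instrument-the-loop idea can be sketched outside of Dabo. The helper below is a hypothetical stand-in, not Dabo code: `append_one` represents whatever per-record work `appendDataSet()` does internally. It prints progress every batch and records how long each batch took, so you can see not just *that* the loop is alive, but whether each successive batch is slower than the last (which would suggest superlinear, e.g. quadratic, behavior rather than a hang):

```python
import time

def append_with_progress(ds, append_one, batch=1000):
    """Append records one at a time, reporting elapsed time per batch.

    ds         -- iterable of records (e.g. the 200,000-row tuple)
    append_one -- callable appending a single record; a hypothetical
                  stand-in for the bizobj's per-record work
    Returns the list of per-batch timings so any slowdown is visible.
    """
    timings = []
    start = time.perf_counter()
    for idx, rec in enumerate(ds, 1):
        append_one(rec)
        if idx % batch == 0:
            now = time.perf_counter()
            timings.append(now - start)
            print("appended %d records; last %d took %.3fs"
                  % (idx, batch, now - start))
            start = now
    return timings
```

If the per-batch times stay roughly constant, the append is merely slow; if they grow steadily, each append is doing work proportional to the rows already appended, and 200,000 records could easily take hours.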

Paul

_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: http://leafe.com/archives/byMID/[email protected]
