On Feb 3, 2009, at 10:16 AM, johnf wrote:

> I have a tuple that contains 200,000+ records.  I was using
> appendDataSet() to populate a bizobj and then save the bizobj.
> However, with this many records the program never seems to finish
> the appendDataSet() call.  I'm guessing that I may have reached a
> limit.  Is there a size limit to a Dabo cursor?  Is there a way
> around the issue?


        There are no built-in size limits, but you are probably reaching the  
limits of free RAM on your machine. You have a 200K-element tuple,  
with each element (I assume) having multiple values in its dict.  
appendDataSet() now makes another copy of that tuple, along with 200K  
mementos to track these new records.

        In general when using large tuples/lists/dicts, you want to avoid
making unnecessary copies. I would iterate over the set in slices of,
say, 100 or maybe 1000 items. Append each slice, save, clear the
cursor, and repeat.
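        A minimal sketch of that slicing loop (the chunks() helper is
mine; the bizobj method names in the commented usage are assumptions
based on this thread, so check them against your actual bizobj API):

```python
def chunks(seq, size):
    """Yield successive slices of `seq`, each at most `size` items long."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

# Hypothetical usage against a Dabo bizobj -- adapt to your API:
#
# for batch in chunks(records, 1000):
#     bizobj.appendDataSet(batch)   # append only this small slice
#     bizobj.save()                 # persist the batch
#     bizobj.requery()              # or however you clear the cursor
```

Since each slice is a view of at most 1000 records, the cursor (and
its mementos) never holds more than that at once.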


-- Ed Leafe




_______________________________________________
Post Messages to: [email protected]
Subscription Maintenance: http://leafe.com/mailman/listinfo/dabo-users
Searchable Archives: http://leafe.com/archives/search/dabo-users
This message: 
http://leafe.com/archives/byMID/[email protected]
