johnf wrote:
> On Tuesday 03 February 2009 09:32:35 am johnf wrote:
>> On Tuesday 03 February 2009 09:13:05 am Uwe Grauer wrote:
>>> johnf wrote:
>>>> I have a tuple that contains 200,000+ records.  I was using
>>>> appendDataSet to populate a bizobj and then save the bizobj.  However,
>>>> with this many records the program never seems to finish the
>>>> "appendDataSet()".  I'm guessing that I may have reached a limit.  Is
>>>> there a size limit to a Dabo cursor?  Is there a way around the issue?
>>> http://www.fullduplex.org/humor/2006/10/how-to-shoot-yourself-in-the-foot-in-any-programming-language/
>>>
>>> ;-)
>>>
>>> Uwe
>> LOL - :-)
> 
> Just an update.  I changed the way I was dealing with the data (using 
> direct inserts) and the 217,000+ records took 48 mins.  Maybe it would 
> have worked if I had waited longer - but I can't confirm.
> 

That works out to about 75 records per second,
so I assume the inserts went through Dabo.
If you need to insert that many records more often,
I would try prepare/executemany() directly with Python's DB-API.
You can get higher throughput by using more than one connection.
Also try committing more often, so each transaction stays small.
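
For illustration, a minimal sketch of that pattern, using sqlite3 as a
stand-in for whatever backend is actually in use (the "people" table and
the records list here are made up); any DB-API 2.0 module works the same
way, modulo the parameter style:

import sqlite3

# Made-up sample data standing in for the real 217,000+ record tuple.
records = [(i, "name-%d" % i) for i in range(217000)]

conn = sqlite3.connect("people.db")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS people (id INTEGER, name TEXT)")

BATCH = 10000  # commit every 10,000 rows instead of one huge transaction

for start in range(0, len(records), BATCH):
    chunk = records[start:start + BATCH]
    # executemany() pushes the whole chunk through one prepared
    # statement, avoiding per-row overhead in Python and the driver.
    cur.executemany("INSERT INTO people (id, name) VALUES (?, ?)", chunk)
    conn.commit()  # per-batch commits keep each transaction small

conn.close()

To use more than one connection, run the same loop in a few threads or
processes, each with its own connection and its own slice of the records;
whether that helps depends on the backend (a client/server database can
benefit, SQLite will not, since it serializes writes).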

Uwe
