I didn't know that either. Is there a limit on the number of dicts passed 
to bulk_insert?
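
If there is one, I guess chunking would be the safe pattern, e.g. 
(untested sketch; the field names come from Rocco's snippet below, and 
the 500-row chunk size is an arbitrary guess of mine):

    # one dict per insert, using Rocco's field names
    rows = [dict(CTD_STATION_ID=stationid, SENSOR=n, VALUE=float(e))
            for row in cnv.data_matrix
            for n, e in enumerate(row)]

    CHUNK = 500  # arbitrary; I don't know of a documented limit
    for i in range(0, len(rows), CHUNK):
        db.CTD_DATA.bulk_insert(rows[i:i + CHUNK])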

Paolo

On Monday, May 20, 2013 11:18:51 AM UTC+2, Niphlod wrote:
>
> there's a bulk_insert method too.
>
> http://web2py.com/books/default/chapter/29/06?search=bulk
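>
> For example (untested sketch, reusing the names from Rocco's snippet 
> below):
>
>     rows = [dict(CTD_STATION_ID=stationid, SENSOR=sensor_n, VALUE=float(element))
>             for row in cnv.data_matrix
>             for sensor_n, element in enumerate(row)]
>     # single call; whether it maps to one SQL statement depends on the adapter
>     db.CTD_DATA.bulk_insert(rows)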
>
> On Monday, May 20, 2013 10:36:32 UTC+2, Rocco wrote:
>>
>> Dear all,
>>
>> I have a loop that inserts about 2K records into a Postgres database 
>> (running on the same host).
>>
>> This is the code used:
>>
>>     for row in cnv.data_matrix:
>>         sensor_n = 0
>>         for element in row:
>>             db.CTD_DATA.insert(CTD_STATION_ID=stationid,
>>                                SENSOR=sensor_n,
>>                                VALUE=float(element))
>>             sensor_n += 1
>>
>>
>> It takes more than 20 seconds, sometimes even longer.
>> I could change the database structure to reduce the number of inserts, 
>> but is there a way to aggregate multiple inserts or otherwise improve 
>> performance?
>> Thanks in advance for any suggestion.
>>
>
