Dear all,
I have a loop that inserts about 2K records into a PostgreSQL database (running on
the same host).
This is the code I'm using:
for row in cnv.data_matrix:
    for sensor_n, element in enumerate(row):
        db.CTD_DATA.insert(
            CTD_STATION_ID=stationid,
            SENSOR=sensor_n,
            VALUE=float(element),
        )
It takes more than 20 seconds, sometimes even longer.
I could change the database structure to reduce the number of inserts, but
is there a way to aggregate multiple inserts to improve performance?
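For what it's worth, web2py's DAL does offer `db.table.bulk_insert(list_of_dicts)` to send many rows at once, and wrapping the loop in a single transaction avoids a commit per row. A minimal sketch of the batching idea, using only stdlib sqlite3 (table name, station id, and data are placeholders standing in for the real `db`, `stationid`, and `cnv.data_matrix`):

```python
import sqlite3

# Build all rows first, then insert them with one executemany call inside
# a single transaction, instead of one round-trip per value.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ctd_data (ctd_station_id INTEGER, sensor INTEGER, value REAL)"
)

stationid = 1                              # hypothetical station id
data_matrix = [[10.5, 8.2], [10.4, 8.3]]   # placeholder for cnv.data_matrix

rows = [
    (stationid, sensor_n, float(element))
    for row in data_matrix
    for sensor_n, element in enumerate(row)
]

with conn:  # one transaction for the whole batch
    conn.executemany(
        "INSERT INTO ctd_data (ctd_station_id, sensor, value) VALUES (?, ?, ?)",
        rows,
    )

count = conn.execute("SELECT COUNT(*) FROM ctd_data").fetchone()[0]
```

With the DAL the equivalent would be collecting the dicts and calling `bulk_insert` once; either way, the win comes from batching rather than committing 2K separate statements.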
Thanks in advance for any suggestion.
--
---
You received this message because you are subscribed to the Google Groups
"web2py-users" group.