Hi!
I am trying to export about 5 million records (with 6 float fields) from a table
and write them to a file with Python. I query the database with the pg module.
I have 4 GB of RAM, but the Python script crashes during getresult() with a
MemoryError.

The pgdb module supports the fetchmany() method, but it is quite a bit slower
than the pg module.

What do you think is the best way to extract a huge amount of data from a
database query and write it to a file with a Python script? A loop over
fetchmany() with the pgdb module?
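For reference, here is a minimal sketch of the fetchmany() loop I have in mind.
The helper is generic over any DB-API cursor; the pgdb connection parameters,
table name, and output path in the commented usage are placeholders, not my
actual setup:

```python
def iter_rows(cursor, batch_size=10000):
    """Yield rows one at a time, pulling them from the cursor in
    fixed-size batches via fetchmany() so only one batch is ever
    held in memory (instead of all ~5 million rows at once)."""
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:  # fetchmany() returns an empty sequence when exhausted
            break
        for row in rows:
            yield row

# Hypothetical usage with pgdb (connection details are placeholders):
#
# import pgdb
# con = pgdb.connect(database="mydb")
# cur = con.cursor()
# cur.execute("SELECT f1, f2, f3, f4, f5, f6 FROM mytable")
# with open("export.txt", "w") as out:
#     for row in iter_rows(cur, batch_size=50000):
#         out.write("\t".join(str(v) for v in row) + "\n")
# con.close()
```

The point is that memory use is bounded by batch_size rather than by the
result-set size, at the cost of repeated round trips.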

Bernhard
_______________________________________________
PyGreSQL mailing list
[email protected]
http://mailman.vex.net/mailman/listinfo/pygresql
