Can you elaborate on how much data is being loaded and what performance you're targeting?

If you're concerned with loading many MB of data as periodic batches, the best performance by far will come from generating a text file in one of the formats your database server natively supports, and using a command-line client to load it into the server (e.g. MySQL's LOAD DATA INFILE, or PostgreSQL's COPY via psql).
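A minimal sketch of that approach, assuming MySQL as the target: the table name, column names, and sample rows below are hypothetical, and the generated statement would be run through the mysql command-line client rather than through the ORM.

```python
import csv

# Hypothetical sample rows; in practice these would come from your batch source.
rows = [
    (1, "alice", "alice@example.com"),
    (2, "bob", "bob@example.com"),
]

# Write a tab-delimited file that LOAD DATA can ingest directly.
with open("users.tsv", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t")
    writer.writerows(rows)

# The statement the command-line client would execute; the "users" table
# and its columns are assumptions for illustration.
load_sql = (
    "LOAD DATA LOCAL INFILE 'users.tsv' "
    "INTO TABLE users "
    "FIELDS TERMINATED BY '\\t' "
    "(id, name, email)"
)
print(load_sql)
```

The point is that the Python side only formats text; the server does the actual loading in bulk, which bypasses per-row INSERT overhead entirely.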

I'm not familiar with how MySQL handles foreign key checks, but in PostgreSQL, deferring the constraint check OR dropping and rebuilding the constraint will make things run even faster.
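As a sketch of those two PostgreSQL options: the constraint, table, and column names below are hypothetical, and the statements are shown as plain strings rather than executed (deferring only works if the constraint was created DEFERRABLE).

```python
# Option 1: defer the foreign key check until COMMIT, inside one transaction.
defer_sql = [
    "BEGIN",
    "SET CONSTRAINTS fk_orders_user DEFERRED",
    "-- ... bulk COPY / INSERT statements here ...",
    "COMMIT",
]

# Option 2: drop the constraint, load the data, then rebuild it, so the
# check runs once over the whole table instead of once per row.
rebuild_sql = [
    "ALTER TABLE orders DROP CONSTRAINT fk_orders_user",
    "-- ... bulk load ...",
    "ALTER TABLE orders ADD CONSTRAINT fk_orders_user "
    "FOREIGN KEY (user_id) REFERENCES users (id)",
]

for stmt in defer_sql + rebuild_sql:
    print(stmt)
```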

-- 
SQLAlchemy - 
The Python SQL Toolkit and Object Relational Mapper

http://www.sqlalchemy.org/

To post example code, please provide an MCVE: Minimal, Complete, and Verifiable 
Example.  See  http://stackoverflow.com/help/mcve for a full description.
