If I'm doing a bulk insert of a very large number of rows, is it possible to 
add only the ones that don't violate unique constraints and log the rest? 
Since I'm inserting so many rows, I don't flush/commit after each one; 
instead I let my application batch several hundred at a time and then do a 
.commit(). But if even one row in a batch violates a unique constraint, none 
of the hundreds of non-duplicate rows in that batch get inserted either.
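For context, here's roughly what my loading loop looks like (a minimal 
sketch; Record, the connection URL, and the batch size are stand-ins for my 
actual code):

    from sqlalchemy import create_engine
    from sqlalchemy.orm import sessionmaker

    from myapp.models import Record  # hypothetical model with a unique-constrained column

    engine = create_engine("postgresql://localhost/mydb")  # placeholder URL
    Session = sessionmaker(bind=engine)
    session = Session()

    BATCH_SIZE = 500  # commit every few hundred rows

    for i, row in enumerate(rows, start=1):  # rows: iterable of dicts from my source data
        session.add(Record(**row))
        if i % BATCH_SIZE == 0:
            # A single duplicate anywhere in the pending batch raises
            # IntegrityError here, and the whole batch is lost on rollback.
            session.commit()
    session.commit()  # commit the final partial batch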

I thought about adding the rows one at a time from my application (doing a 
.commit() after each row) so that I can add only the ones that don't 
violate the constraint. But then the loading process becomes extremely slow, 
because of all the database round trips and disk writes, I presume. Is there 
a way to bulk load while keeping performance high?
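The slow per-row variant I tried is essentially this (again a sketch, 
reusing the session, Record, and rows placeholders from above):

    import logging

    from sqlalchemy.exc import IntegrityError

    log = logging.getLogger(__name__)

    for row in rows:
        session.add(Record(**row))
        try:
            session.commit()  # one network round trip + transaction fsync per row
        except IntegrityError:
            session.rollback()  # discard just this row and keep going
            log.warning("skipped duplicate row: %r", row)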

I am currently using the ORM but would switch to Core if it's doable that 
way instead.
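
In case it helps, I assume the Core version of the same batched insert 
would look roughly like this (with "records" standing in for my actual 
table):

    from sqlalchemy import MetaData, Table

    metadata = MetaData()
    records = Table("records", metadata, autoload_with=engine)  # reflect existing table

    with engine.begin() as conn:
        # executemany-style insert: fast, but still all-or-nothing, so one
        # duplicate would presumably abort the whole transaction here too.
        conn.execute(records.insert(), rows)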
