Hello,

While looking for a good way to implement a bulk update of numerous rows in 
the same table, I came across this performance example 
<https://github.com/zzzeek/sqlalchemy/blob/master/examples/performance/bulk_updates.py>:

    session = Session(bind=engine)
    for chunk in range(0, n, 1000):
        customers = session.query(Customer).\
            filter(Customer.id.between(chunk, chunk + 1000)).all()
        for customer in customers:
            customer.description += "updated"
        session.flush()
    session.commit()

I noticed that flush() is called on every iteration over the chunks, rather 
than once at the end, right before commit(). Why is that?

And is the above the recommended way to perform a bulk update, rather than 
e.g. calling Table.update() 
<http://docs.sqlalchemy.org/en/latest/core/metadata.html#sqlalchemy.schema.Table.update>
as suggested in this SO answer 
<https://stackoverflow.com/questions/25694234/bulk-update-in-sqlalchemy-core-using-where#25720751>?
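To make the alternative concrete, here is a rough sketch of what I understand 
the Core-style approach to look like (table and column names are made up for 
illustration, and I'm not sure this is idiomatic):

```python
# Sketch of a Core-style bulk update via Table.update() with bindparam(),
# executed with a list of parameter sets ("executemany"). The `customer`
# table here is purely illustrative.
from sqlalchemy import (
    bindparam, create_engine, Column, Integer, String, MetaData, Table
)

engine = create_engine("sqlite://")
metadata = MetaData()
customer = Table(
    "customer", metadata,
    Column("id", Integer, primary_key=True),
    Column("description", String),
)
metadata.create_all(engine)

with engine.begin() as conn:
    # Seed some rows to update.
    conn.execute(
        customer.insert(),
        [{"id": i, "description": "old"} for i in range(1, 4)],
    )

    # One UPDATE statement, executed with many parameter sets,
    # instead of loading ORM objects and mutating them one by one.
    # Note the bind parameter names ("cid", "descr") are chosen so they
    # don't collide with the column names.
    stmt = (
        customer.update()
        .where(customer.c.id == bindparam("cid"))
        .values(description=bindparam("descr"))
    )
    conn.execute(
        stmt,
        [{"cid": i, "descr": "updated"} for i in range(1, 4)],
    )
```

Is that roughly the pattern the SO answer means, and when would it be 
preferable to the ORM loop above?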

Thank you!
Jens

-- 
SQLAlchemy - 
The Python SQL Toolkit and Object Relational Mapper

http://www.sqlalchemy.org/
