Hi,
I want to write a large number of model instances to the DB in a
multi-database setup. The way I am doing this is essentially:
from django.db import transaction

@transaction.atomic
def process_all():
    for record in Model.objects.using(DB1).all():
        process_one(record)

def process_one(record):
    do_something_with(record)
    record.save(using=DB2)
The way I understand the docs
(https://docs.djangoproject.com/en/1.8/topics/db/transactions/#controlling-transactions-explicitly)
this should start a transaction when process_all() is entered and commit
it when process_all() returns - so all records should be committed at
once. However, that is not what happens: no transaction is started, and
each record is written to the DB immediately.
Neither ATOMIC_REQUESTS nor AUTOCOMMIT are touched in the settings. FWIW, I
use a Postgres DB and Django 1.7.8. Also, replacing the @transaction.atomic
decorator with a "with transaction.atomic():" block doesn't change anything.
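One guess on my part, since the docs say atomic() takes a "using" argument
that defaults to the "default" alias: could it be that the decorator opens a
transaction on a database I'm not actually writing to, and that I'd need
something like the following instead?

    def process_all():
        with transaction.atomic(using=DB2):
            for record in Model.objects.using(DB1).all():
                process_one(record)

(using=DB2 is just my guess at the alias that needs the transaction, since
the saves go to DB2.)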
What am I missing?
Thanks, Lene
--
You received this message because you are subscribed to the Google Groups
"Django users" group.