Hello!
I am updating a large number of records (~11k) with responses from an
external API.
For context, each Company has many Applicants. Applicant has a
`job_application` JSONB column. I am using Postgres 9.4.5, the
`sequel-rails` gem, and the most recent Sequel.
I'm running the code below and seeing surprising behavior:
    Applicant.where(job_application: nil).count # => 11,000

    company.applicants_dataset.where(job_application: nil).use_cursor.each do |applicant|
      job_application = get_applications_from_external_api(applicant)
      applicant.update(job_application: job_application)
    end

    # The loop exits early with an error after a few records update
    Applicant.where(job_application: nil).count # => 11,000
I'm seeing EVERY update made to the `job_application` JSONB column in
Applicant rolled back if the block exits with an error.
My config in `application.rb` is:

    config.sequel.after_connect = proc do
      Sequel::Model.db.extension :pg_json
      Sequel.extension :pg_json_ops
      Sequel::Model.plugin :timestamps, update_on_create: true
    end
It feels like the whole loop is being run inside a single transaction, but
I have not set it up to do so anywhere (other than in the `spec_helper`).
I know this batch update will periodically fail due to API vagaries; is
there any way to keep the updates that succeeded before the error?
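To be concrete, the failure isolation I'm after looks roughly like this
sketch. Note this is plain Ruby, not my actual code: `fetch` and `save` are
placeholders standing in for the external API call and `applicant.update`,
and `update_all` is a name I made up for illustration.

    # Sketch: process each record independently, rescuing failures so that
    # earlier successes are kept and later records still get processed.
    # `fetch` and `save` are injected callables (placeholders for the real
    # API call and Applicant#update).
    def update_all(applicants, fetch:, save:)
      failures = []
      applicants.each do |applicant|
        begin
          save.call(applicant, fetch.call(applicant))
        rescue StandardError => e
          failures << [applicant, e] # record the failure and keep going
        end
      end
      failures
    end

The open question is whether this kind of per-record rescue is enough, or
whether the cursor iteration itself forces everything into one transaction
that rolls back regardless.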
--
You received this message because you are subscribed to the Google Groups
"sequel-talk" group.