Hi Jeremy,

We have a fairly large database in our application (about 5 million rows), 
so when we're doing data migrations via an UPDATE statement, we have to do 
it in batches so that Postgres doesn't lock the table against the rest of 
the application. I was just wondering what the best way to do this is with 
Sequel and Postgres?

Currently we take pages of 1000 rows by an indexed ID column, but I think 
that gives Postgres extra work, because it always has to find the next page 
"by hand", since it doesn't know where it last left off. I was thinking of 
Postgres cursors, so that I could just move forward one page at a time, but 
I'm not that familiar with how to use them for updates. Is it possible to 
do something like that in Sequel?
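For context, here is roughly what our current batching looks like, as a 
keyset ("seek") loop: remember the last ID handled and filter on `id > 
last_id`, which an index on the ID column can satisfy directly instead of 
scanning past already-processed pages. The table and column names below are 
made up for illustration; the runnable part simulates the loop against a 
plain Ruby array so the control flow is clear without a database, with the 
equivalent Sequel calls shown in comments.

```ruby
# Hypothetical table: rows with :id and :migrated columns, standing in
# for a real Postgres table (names are illustrative only).
rows = (1..25).map { |i| { id: i, migrated: false } }

batch_size = 10
last_id = 0 # keyset cursor: highest id handled so far

loop do
  # Equivalent Sequel query (assumes an index on id):
  #   ids = DB[:rows].where { id > last_id }.order(:id)
  #           .limit(batch_size).select_map(:id)
  batch = rows.select { |r| r[:id] > last_id }
              .sort_by { |r| r[:id] }
              .first(batch_size)
  break if batch.empty?

  # Equivalent batched UPDATE, touching (and locking) only these rows:
  #   DB[:rows].where(id: ids).update(migrated: true)
  batch.each { |r| r[:migrated] = true }

  last_id = batch.last[:id] # seek forward; no OFFSET scan needed
end
```

Each iteration is its own short UPDATE, so row locks are held only briefly 
per batch rather than for one table-wide statement.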

Thanks,
Janko

-- 
You received this message because you are subscribed to the Google Groups 
"sequel-talk" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
Visit this group at http://groups.google.com/group/sequel-talk.
For more options, visit https://groups.google.com/d/optout.