On Tue, Apr 26, 2011 at 11:55 PM, Josh Berkus <j...@agliodbs.com> wrote:
> Here's where I wanted autonomous transactions just last week, and didn't
> have them so I had to use a python script outside the database:
>
> -- doing a CREATE INDEX CONCURRENTLY for 150 partitions on a partitioned
> table.
>
> -- doing a backfill operation for 10GB of computed data, taking 8 hours,
> where I don't want to hold a transaction open for 8 hours since this is
> a high-volume OLTP database.
These don't seem like compelling use cases to me at all. You said you had to fall back to a python script outside the database, but what disadvantage does that have? Why is moving your application logic into the database an improvement?

Honestly, in every case where I've had to move code that had been in a function out to the application, I've found there were tons of benefits: being able to better control the behaviour, to parallelize the processing over multiple connections, to run parts of it at different times, to see the progress and control it from another session, to manage the code in version control -- the list just goes on. Trying to move all the code into the database just makes life harder.

Autonomous transactions have value on their own, but not so that you can run CREATE INDEX CONCURRENTLY or VACUUM or whatever. They're useful so that a single session can do things like log errors even when a transaction rolls back. Actually, that's the only example I can think of, but it's a pretty good use case on its own and I'm sure it's not entirely unique.

-- 
greg

-- 
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers
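[Editorial illustration, not part of the original mail.] The error-logging pattern described above can be sketched without autonomous transactions by using a second, independent connection whose commits are decoupled from the main transaction; in PostgreSQL this is commonly approximated with a second session (e.g. via dblink). The sketch below uses Python's stdlib sqlite3 purely for illustration, with two separate database files and table/column names that are invented for the example:

```python
import os
import sqlite3
import tempfile

tmp = tempfile.mkdtemp()

# Main connection: runs the application transaction that will fail.
work = sqlite3.connect(os.path.join(tmp, "app.db"))
work.execute("CREATE TABLE accounts (balance INTEGER)")
work.commit()

# Independent "autonomous" connection: its commits do not depend on
# the main transaction's fate.
log_conn = sqlite3.connect(os.path.join(tmp, "log.db"))
log_conn.execute("CREATE TABLE error_log (msg TEXT)")
log_conn.commit()

try:
    work.execute("INSERT INTO accounts VALUES (100)")
    raise RuntimeError("business-logic failure")
except RuntimeError as exc:
    # Record the error on the separate connection and commit it there:
    # this commit is independent of the main transaction...
    log_conn.execute("INSERT INTO error_log VALUES (?)", (str(exc),))
    log_conn.commit()
    # ...so it survives when the main transaction rolls back.
    work.rollback()

remaining_rows = work.execute("SELECT count(*) FROM accounts").fetchone()[0]
logged = log_conn.execute("SELECT msg FROM error_log").fetchone()[0]
```

Had the log INSERT gone through the `work` connection instead, the rollback would have discarded it along with the rest of the transaction; that is exactly the gap a true autonomous transaction would close within a single session.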