Re: [HACKERS] Preventing duplicate vacuums?

2004-02-09 Thread Robert Treat
On Sat, 2004-02-07 at 02:07, Tom Lane wrote: Robert Treat [EMAIL PROTECTED] writes: Don't know if I would agree for sure, but if the second vacuum could see that it is being blocked by the current vacuum, exiting out would be a bonus, since in most scenarios you don't need to run that second

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-07 Thread Tom Lane
Robert Treat [EMAIL PROTECTED] writes: Don't know if I would agree for sure, but if the second vacuum could see that it is being blocked by the current vacuum, exiting out would be a bonus, since in most scenarios you don't need to run that second vacuum so it just ends up wasting resources (or
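
(A hedged aside, not part of the archived messages: one rough way to approximate from a client the check Robert describes, rather than doing it inside the server. It assumes present-day pg_locks column names and that lazy VACUUM holds ShareUpdateExclusiveLock on the table; 'mytable' is a placeholder name.)

    -- Does some other backend already hold the lock a lazy VACUUM takes on this table?
    SELECT l.pid, l.mode, l.granted
      FROM pg_locks l
      JOIN pg_class c ON c.oid = l.relation
     WHERE c.relname = 'mytable'
       AND l.mode = 'ShareUpdateExclusiveLock';
    -- A row for another pid means a VACUUM (or something at the same lock level) is
    -- already working on the table; a second VACUUM issued now would just queue behind it.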

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-06 Thread Robert Treat
On Thu, 2004-02-05 at 16:51, Josh Berkus wrote: Tom, Yes we do: there's a lock. Sorry, bad test. Forget I said anything. Personally, I would like to have the 2nd vacuum error out instead of blocking. However, I'll bet that a lot of people won't agree with me. Don't know if I

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-06 Thread Thomas Swan
Robert Treat wrote: On Thu, 2004-02-05 at 16:51, Josh Berkus wrote: Tom, Yes we do: there's a lock. Sorry, bad test. Forget I said anything. Personally, I would like to have the 2nd vacuum error out instead of blocking. However, I'll bet that a lot of people won't agree with

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-06 Thread Joshua D. Drake
What about a situation where someone has lazy vacuums cron'd and it takes longer to complete the vacuum than the interval between vacuums? You could wind up with an ever-increasing queue of vacuums. Erroring out with a 'vacuum already in progress' message might be useful. I have seen this
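
(Another hedged sketch, not from the thread: a guard a cron wrapper could run before issuing VACUUM, so overlapping runs skip rather than pile up. It uses present-day pg_stat_activity columns; the view in 2004 was shaped differently and required stats_command_string. 'big_table' is a placeholder name.)

    -- Is a VACUUM already running in some other session?
    SELECT pid, now() - query_start AS running_for, query
      FROM pg_stat_activity
     WHERE state = 'active'
       AND query ILIKE 'vacuum%'
       AND pid <> pg_backend_pid();
    -- The cron script issues "VACUUM big_table;" only when this returns no rows;
    -- otherwise it exits and lets the next scheduled run try again.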

[HACKERS] Preventing duplicate vacuums?

2004-02-05 Thread Josh Berkus
Folks, Just occurred to me that we have no code to prevent a user from running two simultaneous lazy vacuums on the same table. I can't think of any circumstance where running two vacuums would be desirable behavior; how difficult would it be to make this an exception? This becomes a more
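
(For context, a hedged illustration of the behavior under discussion, using a hypothetical table named big_table: the second VACUUM is not rejected, it simply waits on the first one's lock.)

    -- Session 1:
    VACUUM big_table;   -- takes the table's vacuum-level lock and runs for a while

    -- Session 2, started while session 1 is still running:
    VACUUM big_table;   -- does not error out; it blocks until session 1 finishes,
                        -- then performs a second, now largely redundant, vacuum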

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-05 Thread Rod Taylor
On Thu, 2004-02-05 at 15:37, Josh Berkus wrote: Folks, Just occurred to me that we have no code to prevent a user from running two simultaneous lazy vacuums on the same table. I can't think of any circumstance where running two vacuums would be desirable behavior; how difficult would it

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-05 Thread Josh Berkus
Rod, You have an 8 billion row table with some very high-turnover tuples (lots of updates to a few thousand rows). A partial or targeted vacuum would be best; failing that, you kick them off fairly frequently, especially if IO isn't really an issue. Yes, but we don't have partial or targeted

Re: [HACKERS] Preventing duplicate vacuums?

2004-02-05 Thread Josh Berkus
Tom, Yes we do: there's a lock. Sorry, bad test. Forget I said anything. Personally, I would like to have the 2nd vacuum error out instead of blocking. However, I'll bet that a lot of people won't agree with me. -- -Josh Berkus Aglio Database Solutions San Francisco