Re: [PERFORM] BUG #1697: Select getting slower on continously updating data

2005-06-05 Thread Bahadur Singh


--- Bruno Wolff III <[EMAIL PROTECTED]> wrote:

> This does not belong on the pgsql-bugs list. The
> pgsql-novice or
> pgsql-performance lists seem more appropriate. I have
> set followups
> to the pgsql-novice list.
> 
> On Thu, Jun 02, 2005 at 12:05:00 +0100,
>   Bahadur Singh <[EMAIL PROTECTED]> wrote:
> > 
> > Hello,
> > 
> > I found a situation where selecting data from a table of 200
> > records gets slower as I continuously update the same existing data.
> 
> You need to be vacuuming (and possibly analyzing)
> the table more often as the
> updates will leave dead rows in the table which will
> bloat the table size and
> slow down access, particularly sequential scans. If
> the updates modify the
> data value distributions significantly, then you
> will also need to
> reanalyze the table to help the planner make good
> decisions.
> 
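The maintenance Bruno suggests above can be expressed as a single SQL command. A minimal sketch, using the table name that appears in the ANALYZE output later in this message (not part of the original mail):

```sql
-- Reclaim space held by dead rows and refresh the planner's statistics
-- in one pass. Plain VACUUM (without FULL) does not take an exclusive
-- lock, so it can run alongside normal reads and writes.
VACUUM ANALYZE salesarticle;
```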

Many thanks for this tip!
But is it a good idea to analyze/vacuum the
database tables while updates are taking place,
since I update the same data set continuously,
say 100,000 times or more?

This is the result of the analyze command:

INFO:  analyzing "public.salesarticle"
INFO:  "salesarticle": scanned 3000 of 20850 pages,
containing 62 live rows and 134938 dead rows; 62 rows
in sample, 431 estimated total rows

Total query runtime: 5531 ms.

Can you suggest a clever way to do so? I
would prefer to do the vacuuming while the database is not
loaded with queries/transactions.

Regards
Bahadur
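One possible approach to the question above (a sketch, not from the original thread): run a plain VACUUM on the hot table from a periodic job while the update workload continues. Unlike VACUUM FULL, plain VACUUM does not take an exclusive lock, so concurrent queries and updates are not blocked. The database name `mydb` is assumed here for illustration:

```shell
# Vacuum and analyze just the heavily-updated table; safe to run while
# the update workload is active. Schedule this e.g. from cron every few
# minutes so dead rows never accumulate into tens of thousands of pages.
vacuumdb --analyze --table=salesarticle mydb
```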








Re: [PERFORM] BUG #1697: Select getting slower on continously updating data

2005-06-03 Thread Bruno Wolff III
On Fri, Jun 03, 2005 at 00:09:00 -0700,
  Bahadur Singh <[EMAIL PROTECTED]> wrote:
> 
> Many thanks for this tip!
> But is it a good idea to analyze/vacuum the
> database tables while updates are taking place,
> since I update the same data set continuously,
> say 100,000 times or more?
> 
> This is the result of the analyze command:
> 
> INFO:  analyzing "public.salesarticle"
> INFO:  "salesarticle": scanned 3000 of 20850 pages,
> containing 62 live rows and 134938 dead rows; 62 rows
> in sample, 431 estimated total rows
> 
> Total query runtime: 5531 ms.
> 
> Can you suggest a clever way to do so? I
> would prefer to do the vacuuming while the database is not
> loaded with queries/transactions.

While that may be a nice preference, under your usage pattern it does
not appear to be a good idea. As long as your disk I/O isn't saturated,
you want to be running vacuums a lot more often than you are. (Analyze should
only be needed if the distribution of values is changing constantly. An example
would be timestamps indicating when an update occurred.)
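One way to watch the bloat described above is to compare the table's page count against its live row count in the system catalogs. A sketch (note that `relpages` and `reltuples` are only refreshed by VACUUM/ANALYZE, not in real time):

```sql
-- relpages grows as dead rows accumulate, while reltuples stays near the
-- true live row count; a large pages-to-rows ratio (like the 20850 pages
-- for ~431 rows seen earlier in this thread) means vacuum is overdue.
SELECT relname, relpages, reltuples
FROM pg_class
WHERE relname = 'salesarticle';
```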
