--- Bruno Wolff III <[EMAIL PROTECTED]> wrote:

> This does not belong on the pgsql-bugs list. The pgsql-novice or
> pgsql-performance lists seem more appropriate. I have set followups
> to the pgsql-novice list.
> 
> On Thu, Jun 02, 2005 at 12:05:00 +0100,
>   Bahadur Singh <[EMAIL PROTECTED]> wrote:
> > 
> > Hello,
> > 
> > I found a situation where selecting data from a table of 200
> > records gets slower as I do continuous updates to the same
> > existing data.
> 
> You need to be vacuuming (and possibly analyzing) the table more
> often, as the updates will leave dead rows in the table, which will
> bloat the table size and slow down access, particularly sequential
> scans. If the updates modify the data value distributions
> significantly, then you will also need to reanalyze the table to
> help the planner make good decisions.
> 

Many thanks for this tip!
But is it a good idea to analyze/vacuum the database tables while
updates are taking place? I update the same data set continuously,
say 100,000 times or more.
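
If I understand the suggestion correctly, this is roughly what I would
run between batches of updates (only a sketch; as far as I know a plain
VACUUM, i.e. without FULL, does not take an exclusive lock, so it should
be able to run while the updates continue):

  -- plain VACUUM marks the space of dead rows as reusable and
  -- ANALYZE refreshes the planner statistics; neither should block
  -- ordinary UPDATEs on the table
  VACUUM ANALYZE public.salesarticle;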

This is the result of the ANALYZE command on that table:

INFO:  analyzing "public.salesarticle"
INFO:  "salesarticle": scanned 3000 of 20850 pages,
containing 62 live rows and 134938 dead rows; 62 rows
in sample, 431 estimated total rows

Total Time Taken : 5531 ms.
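
If I read the numbers right, at the default 8 kB page size those 20850
pages come to roughly 160 MB of table for only a few hundred live rows,
so almost all of it is dead tuples.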

Can you suggest some clever way to do this? I would prefer to do the
vacuuming while the database is not loaded with queries/transactions.
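
What I had in mind for such a quiet window is roughly this (again only a
sketch; as far as I know VACUUM FULL compacts the table and reclaims the
space already lost to dead rows, but it takes an exclusive lock, so it
cannot run alongside the updates):

  -- VACUUM FULL compacts the table and returns the freed space, but it
  -- locks the table exclusively while it runs, so it only fits a window
  -- with no concurrent queries/transactions
  VACUUM FULL ANALYZE public.salesarticle;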

Regards
Bahadur



                