You could make the following change in your postgresql.conf file and then reload the PostgreSQL server:
log_min_duration_statement = 10000

This will log every query that takes more than 10000 milliseconds to execute to your PostgreSQL server log file.

regards
Gourish Singbal

On 5/2/05, Enrico Weigelt <[EMAIL PROTECTED]> wrote:
>
> Hi folks,
>
> I'd like to find out which queries are most expensive (taking very
> long or producing high load) in a running system, to see what
> requires further optimization. (The application is quite large and
> several other people are involved, so I can't check everything manually.)
>
> Well, the postmaster can log every single statement, but that is
> really too much for a human to read through in the log files.
>
> Is there any tool for that?
>
> thx
> --
> ---------------------------------------------------------------------
>  Enrico Weigelt    ==    metux IT service
>   phone:  +49 36207 519931    www:    http://www.metux.de/
>   fax:    +49 36207 519932    email:  [EMAIL PROTECTED]
> ---------------------------------------------------------------------
>  Realtime Forex/Stock Exchange trading powered by PostgreSQL :))
>  http://www.fxignal.net/
> ---------------------------------------------------------------------

--
Best,
Gourish Singbal
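
P.S. For reference, a minimal sketch of the change and how to apply it; the data directory path below is only an example, adjust it for your installation:

    # postgresql.conf -- log any statement that runs longer than 10 seconds
    log_min_duration_statement = 10000

    # setting it to 0 logs every statement together with its duration
    # log_min_duration_statement = 0

    # apply the change without restarting the server
    pg_ctl reload -D /usr/local/pgsql/data

    -- or, for the current session only (requires superuser):
    SET log_min_duration_statement = 10000;

The logged durations can then be grepped or sorted from the server log to find the most expensive queries.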
