Hi Jacob,

This is the same problem you hit when trying to monitor the total number of
processed queries, or the total query processing time.

The only solution I'm aware of is to set pg_stat_statements.max higher than
the number of distinct normalized queries you expect.
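
For example, a quick check of how close you are to that limit (just a
sketch; it assumes the extension is already installed and loaded):

    SELECT count(*) AS tracked_statements FROM pg_stat_statements;
    SHOW pg_stat_statements.max;

If the first number is sitting right at the second, entries are already
being evicted.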

As soon as the number of distinct statements goes above that limit, the
least-executed entries are discarded and you lose information.
To increase pg_stat_statements.max you'd need a restart, which isn't ideal
but I can't see any way round that.
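
Something like this works if you don't want to edit postgresql.conf by hand
(the value 10000 is just an example, pick whatever suits your workload):

    ALTER SYSTEM SET pg_stat_statements.max = 10000;
    -- pg_stat_statements.max sizes shared memory, so it only takes
    -- effect after a full restart, e.g.:
    --   pg_ctl restart -D /path/to/your/data/directory

Note that ALTER SYSTEM only writes postgresql.auto.conf; a reload isn't
enough for this parameter.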

It wouldn't help you much here, but it would be great for monitoring if
there were a pg_stat_statements_agg table which presented total counters.
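
Until then you can roll up what's currently in the view yourself, though the
totals are only as good as whatever hasn't been evicted yet. A rough sketch
(total_time is in milliseconds; the regex on the query text is just one way
of picking out a "class" of statements along the lines you described):

    -- grand totals across everything still tracked
    SELECT sum(calls)      AS total_calls,
           sum(total_time) AS total_time_ms
    FROM   pg_stat_statements;

    -- or per "class", e.g. normalized statements using a POSIX regex operator
    SELECT sum(calls)      AS calls,
           sum(total_time) AS time_ms
    FROM   pg_stat_statements
    WHERE  query ~ '~';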

Cheers,



James Sewell,
PostgreSQL Team Lead / Solutions Architect



Suite 112, Jones Bay Wharf, 26-32 Pirrama Road, Pyrmont NSW 2009
P (+61) 2 8099 9000   W www.jirotech.com   F (+61) 2 8099 9099

On Fri, Apr 28, 2017 at 3:33 AM, Jacob Scott <jacob.sc...@gmail.com> wrote:

> When reviewing execution statistics, I am frequently interested in the
> behavior of "classes" of queries, rather than individual queries, for
> example queries which
>
>    - Contain a join
>    - Touch a specific column
>    - Use POSIX regular expressions
>
> AFAIK this sort of summary/rollup information can't be computed reliably
> from pg_stat_statements because of rows being discarded in the face of a
> large number of (post normalization/jumble) distinct statements.
>
> Is there a way to retrieve these kinds of statistics in Postgres today?
>
> Thanks,
>
> Jacob
>
