On 10/20/2016 12:36 PM, Aleksander Alekseev wrote:
According to my colleagues it would be very nice to have this feature.
For instance, if you are trying to optimize PostgreSQL for an application
that uses COPY and you don't have access to it, or something like this.
It could also be useful in some other cases.
This use-case doesn't really make much sense to me.  Can you explain it
in more detail?  Is the goal here to replicate all of the statements
that are changing data in the database?
The idea is to record an application's workload in the real environment and
build a benchmark based on this recording. Then, using this benchmark, we
could try different OS/DBMS configurations (or maybe hardware), find an
optimum, and then change the configuration in the production environment.

It's not always possible to change the application or even the database
(e.g. to use triggers) for this purpose, for instance if the DBMS is
provided as a service.

Currently PostgreSQL allows recording all of the workload _except_ COPY
queries. Considering how easily it could be done, I think that's wrong.
Basically the only real question here is what the interface should look like.
OK, how about introducing a new boolean parameter named log_copy?
Corresponding patch is attached.
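For reference, with the proposed patch the feature would presumably be enabled through postgresql.conf alongside the existing logging settings. This is a hypothetical sketch of such a configuration (the log_copy name comes from the patch above; the surrounding parameter values are just illustrative assumptions, not part of the patch):

```
# postgresql.conf -- illustrative logging setup (values are assumptions)
log_statement = 'all'        # existing GUC: logs statement text, but not COPY data rows
log_copy = on                # proposed GUC from the attached patch: also log COPY queries
log_destination = 'stderr'
logging_collector = on
```

With a setup along these lines, the recorded log could then be replayed against a test instance to benchmark alternative configurations, as described earlier in the thread.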

This is a useful feature I have been waiting on for some time.
If an application whose workload you want to collect uses COPY statements, then recording network traffic was your only option.

Grigory Smolkin
Postgres Professional: http://www.postgrespro.com
The Russian Postgres Company
