From: Laurenz Albe <laurenz.a...@cybertec.at>
Sent: Tuesday, February 20, 2024 8:29 AM
Subject: Re: "not related" code blocks for removal of dead rows when using vacuum and this kills the performance
>On Tue, 2024-02-20 at 05:46 +0000, Lars Aksel Opsahl wrote:
>> If this is expected behavior it means that any user on the database that 
>> writes
>> a long running sql that does not even insert any data can kill performance 
>> for
>> any other user in the database.
>
>Yes, that is the case.  A long running query will hold a snapshot, and no data
>visible in that snapshot can be deleted.
>
>That can cause bloat, which can impact performance.
>
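For anyone else hitting this, the session whose snapshot is holding back cleanup can usually be spotted in `pg_stat_activity`. A minimal diagnostic sketch (assuming a reasonably recent PostgreSQL; run it in any session with sufficient privileges):

```sql
-- List backends whose snapshot (backend_xmin) may be holding back
-- removal of dead rows, oldest snapshot first.
SELECT pid,
       datname,
       state,
       backend_xmin,
       age(backend_xmin)  AS xmin_age,
       now() - xact_start AS xact_duration,
       left(query, 60)    AS query
FROM pg_stat_activity
WHERE backend_xmin IS NOT NULL
ORDER BY age(backend_xmin) DESC;
```

A `VACUUM (VERBOSE)` on the bloated table will also report dead row versions that "cannot be removed yet", which points at the same problem.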

Hi

Thanks for the chat. It seems I have finally found a solution that works for 
this test code.

Adding commits, like in 
/uploads/031b350bc1f65752b013ee4ae5ae64a3/test_issue_67_with_commit.sql, to the 
master code, even when there is nothing to commit, seems to solve the problem. 
That makes sense based on what you say, because the master code then gets a new 
snapshot and releases the old one.
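The pattern can be sketched like this (a hypothetical psql script, not the actual test_issue_67 code; step_one/step_two are made-up procedures standing in for the real work):

```sql
-- Committing after each orchestration step releases the session's
-- snapshot, so concurrent VACUUMs can remove dead rows in between.
BEGIN;
CALL step_one();   -- hypothetical procedure doing some work
COMMIT;            -- releases this session's snapshot

BEGIN;
CALL step_two();   -- next step starts with a fresh snapshot
COMMIT;            -- harmless even if the step wrote nothing
```

The point is simply that the orchestrating session never holds a single snapshot open across the whole run.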

The reason I like to use psql as the master/orchestration code, rather than 
C/Python/Bash and so on, is that it is simpler to use, code, and test.

Lars
