Re: Help understanding server-side logging (and more...)

2023-12-21 Thread Adrian Klaver

On 12/21/23 03:05, Dominique Devienne wrote:

> If, by any chance, anyone has seen this kind of behavior before, in a
> similar context or not, and has suggestions on how we could further
> troubleshoot this (on the PostgreSQL server-side, but also the Node.JS
> client-side, why not), that would be greatly appreciated.


1) It would be helpful to add the Postgres version to any future posts
(e.g. via the query shown below).

2) Also, it would help to have more information on what "They run
PostgreSQL in a Linux container" actually means.







--
Adrian Klaver
adrian.kla...@aklaver.com





Re: Help understanding server-side logging (and more...)

2023-12-21 Thread Peter J. Holzer
On 2023-12-21 12:05:41 +0100, Dominique Devienne wrote:
> FWIW, the symptoms they are experiencing is that initially all queries
> are fast, then they start to get longer and longer, to the point all
> services accessing PostgreSQL grind to a halt and basically hang.
> While that happens, one can connect to the DB from a shell (that
> cluster has a single DB) w/o issues, and run queries just fine

If you do that, do you see the "hanging" queries in pg_stat_activity? If
so, what are they waiting for?
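
Something along these lines (untested sketch; the exact columns assume a
reasonably recent Postgres, since the server version wasn't given) would
show that:

  -- non-idle sessions and what, if anything, they are waiting on;
  -- wait_event_type/wait_event are NULL while a backend is busy on CPU
  SELECT pid, state, wait_event_type, wait_event,
         now() - query_start AS running_for,
         left(query, 80)     AS query
  FROM pg_stat_activity
  WHERE state <> 'idle'
  ORDER BY query_start;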

hp

-- 
   _  | Peter J. Holzer    | Story must make more sense than reality.
|_|_) |                    |
| |   | h...@hjp.at        |    -- Charles Stross, "Creative writing
__/   | http://www.hjp.at/ |       challenge!"




Re: Help understanding server-side logging (and more...)

2023-12-21 Thread Julien Rouhaud
Hi,

On Thu, Dec 21, 2023 at 12:05:41PM +0100, Dominique Devienne wrote:
> Hi. Another team (than mine) has been trying to troubleshoot hang issues in
> their Node/JS/TS-based mid-tier services,
> after a switch from MySQL to PostgreSQL. They run PostgreSQL in a Linux
> container (see PID [1] below), unlike us.
> They started looking at the PostgreSQL server-side logs, but with little to
> no experience reading them, we'd like to
> confirm the extract below looks normal, notably whether the various
> exit-related messages are OK or not.

All those messages are at DEBUG level.  Anything abnormal would be emitted at a
higher level.
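
For reference, the level in use can be checked, and later turned back down,
with something like this (sketch only; ALTER SYSTEM needs superuser rights):

  SHOW log_min_messages;                          -- probably some debugN here
  ALTER SYSTEM SET log_min_messages = 'warning';  -- back to the default
  SELECT pg_reload_conf();
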
>
>
> FWIW, the symptoms they are experiencing is that initially all queries are
> fast, then they start to get longer and longer,
> to the point all services accessing PostgreSQL grind to a halt and
> basically hang. While that happens, one can connect
> to the DB from a shell (that cluster has a single DB) w/o issues, and run
> queries just fine (they have little data,
> in the few dozens MBs spread in 2 schemas and probably 2 or 3 dozen tables
> max), thus I personally think the problem
> is more on the client Node.JS side, but given that the same code worked
> fine with MySQL, maybe I'm wrong.

Yeah, "everything works normally except inside the app" is usually a good
indication that it's unlikely to be a postgres problem.  Maybe they have some
locking issue, transactions that are opened for too long or something else but
it's impossible to say without much more information.  One sure thing is that
running with logs at DEBUG level is going to add noticeable overhead.
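
As a starting point, an untested sketch like this (assuming a reasonably
recent Postgres) would show long-open transactions, sessions sitting "idle
in transaction", and anything stuck waiting on a lock:

  -- sessions with an open transaction, oldest first, plus their wait state
  SELECT pid, state, wait_event_type, wait_event,
         now() - xact_start AS xact_age,
         left(query, 80)    AS query
  FROM pg_stat_activity
  WHERE xact_start IS NOT NULL
  ORDER BY xact_start;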




Help understanding server-side logging (and more...)

2023-12-21 Thread Dominique Devienne
Hi. Another team (than mine) has been trying to troubleshoot hang issues in
their Node/JS/TS-based mid-tier services after a switch from MySQL to
PostgreSQL. They run PostgreSQL in a Linux container (see PID [1] below),
unlike us. They started looking at the PostgreSQL server-side logs, but with
little to no experience reading them, we'd like to confirm that the extract
below looks normal, notably whether the various exit-related messages are OK
or not.

I'm guessing it's fine/normal, especially since they use a Node PostgreSQL
module with Pooling (10 connections max by default)
that disconnects idle connections after 10s, which matches what the log
shows. Can anyone please confirm?
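
To cross-check that from the server side while the services are busy, we
could run something like this from psql (untested sketch; the backend_type
filter needs PostgreSQL 10 or later):

  -- client connections: how long each has existed and how long it has been
  -- in its current state (e.g. idle since its last query)
  SELECT pid, usename, application_name, state,
         now() - state_change  AS in_current_state,
         now() - backend_start AS connection_age
  FROM pg_stat_activity
  WHERE backend_type = 'client backend'
  ORDER BY backend_start;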

FWIW, the symptoms they are experiencing are that initially all queries are
fast, then they start to get longer and longer, to the point that all
services accessing PostgreSQL grind to a halt and basically hang. While that
happens, one can connect to the DB from a shell (that cluster has a single
DB) w/o issues, and run queries just fine (they have little data, a few
dozen MBs spread over 2 schemas and probably 2 or 3 dozen tables max), thus
I personally think the problem is more on the client Node.JS side, but given
that the same code worked fine with MySQL, maybe I'm wrong.

Their services are concurrent and accessed by maybe 50-100 users tops, with
queries coming in all the time.

If, by any chance, anyone has seen this kind of behavior before, in a
similar context or not, and has suggestions on how we could further
troubleshoot this (on the PostgreSQL server-side, but also the Node.JS
client-side, why not), that would be greatly appreciated.

Thanks, --DD

2023-12-20 00:00:16.185 UTC [3605] DEBUG:  InitPostgres
2023-12-20 00:00:16.185 UTC [3605] DEBUG:  my backend ID is 4
2023-12-20 00:00:16.186 UTC [1] DEBUG:  forked new backend, pid=3606 socket=10
2023-12-20 00:00:16.187 UTC [3606] DEBUG:  InitPostgres
2023-12-20 00:00:16.187 UTC [3606] DEBUG:  my backend ID is 6
2023-12-20 00:00:16.188 UTC [3605] DEBUG:  StartTransaction(1) name: unnamed; blockState: DEFAULT; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.189 UTC [3605] DEBUG:  processing received SASL response of length 51
2023-12-20 00:00:16.189 UTC [3605] DEBUG:  sending SASL challenge of length 84
2023-12-20 00:00:16.190 UTC [3606] DEBUG:  StartTransaction(1) name: unnamed; blockState: DEFAULT; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.191 UTC [3606] DEBUG:  processing received SASL response of length 51
2023-12-20 00:00:16.191 UTC [3606] DEBUG:  sending SASL challenge of length 84
2023-12-20 00:00:16.193 UTC [3605] DEBUG:  processing received SASL response of length 104
2023-12-20 00:00:16.193 UTC [3605] DEBUG:  sending SASL challenge of length 46
2023-12-20 00:00:16.195 UTC [3606] DEBUG:  processing received SASL response of length 104
2023-12-20 00:00:16.195 UTC [3606] DEBUG:  sending SASL challenge of length 46
2023-12-20 00:00:16.200 UTC [3605] DEBUG:  CommitTransaction(1) name: unnamed; blockState: STARTED; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.201 UTC [3605] DEBUG:  StartTransaction(1) name: unnamed; blockState: DEFAULT; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.201 UTC [3605] STATEMENT:  SELECT * FROM "NSP"."T1"
2023-12-20 00:00:16.202 UTC [3606] DEBUG:  CommitTransaction(1) name: unnamed; blockState: STARTED; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.203 UTC [3606] DEBUG:  StartTransaction(1) name: unnamed; blockState: DEFAULT; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.203 UTC [3606] STATEMENT:  SELECT * FROM "NSP"."T2"
2023-12-20 00:00:16.205 UTC [3605] DEBUG:  CommitTransaction(1) name: unnamed; blockState: STARTED; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.205 UTC [3605] STATEMENT:  SELECT * FROM "NSP"."T1"
2023-12-20 00:00:16.207 UTC [3606] DEBUG:  CommitTransaction(1) name: unnamed; blockState: STARTED; state: INPROGRESS, xid/subid/cid: 0/1/0
2023-12-20 00:00:16.207 UTC [3606] STATEMENT:  SELECT * FROM "NSP"."T2"
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  shmem_exit(0): 4 before_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  shmem_exit(0): 6 on_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3606] DEBUG:  shmem_exit(0): 4 before_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  proc_exit(0): 2 callbacks to make
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  exit(0)
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  shmem_exit(-1): 0 before_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  shmem_exit(-1): 0 on_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3605] DEBUG:  proc_exit(-1): 0 callbacks to make
2023-12-20 00:00:26.209 UTC [3606] DEBUG:  shmem_exit(0): 6 on_shmem_exit callbacks to make
2023-12-20 00:00:26.209 UTC [3606] DEBUG:  proc_exit(0): 2 callbacks to make
2023-12-20 00:00:26.209 UTC [3606] DEBUG:  exit(0)
2023-12-20 00:00:26.209 UTC [3606]