Re: Selecting RAM and CPU based on max_connections

2022-05-20 Thread Ganesh Korde
You may also need to tune the shmmax and shmmin kernel parameters.
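On Linux that would be something like the following (just a sketch, with
purely illustrative values; what you actually need depends on shared_buffers
and the rest of the memory settings):

    # /etc/sysctl.d/30-postgresql-shm.conf  -- illustrative values only
    kernel.shmmax = 17179869184    # max size of one shared memory segment (16 GB)
    kernel.shmall = 4194304        # total shared memory, in 4 kB pages (16 GB)

    # load it
    sysctl -p /etc/sysctl.d/30-postgresql-shm.conf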

Regards,
Ganesh Korde.

On Fri, 20 May 2022, 1:58 pm aditya desai,  wrote:

> Hi,
> One of our applications needs 3000 max_connections to the database.
> A connection pooler like pgbouncer or pgpool is not yet certified within
> the organization, so they are looking at setting up high-specification
> hardware with enough CPU and memory. Can someone advise how much memory
> and CPU they will need if they want to set max_connections = 3000?
>
> Regards,
> Aditya.
>


Re: Selecting RAM and CPU based on max_connections

2022-05-20 Thread aditya desai
Thanks! I will run these suggestions by the App team.

On Fri, May 20, 2022 at 4:01 PM Laurenz Albe wrote:

> On Fri, 2022-05-20 at 12:15 +0200, Andreas Kretschmer wrote:
> > On 20 May 2022 10:27:50 CEST, aditya desai  wrote:
> > > One of our applications needs 3000 max_connections to the database.
> > > A connection pooler like pgbouncer or pgpool is not yet certified within
> > > the organization, so they are looking at setting up high-specification
> > > hardware with enough CPU and memory. Can someone advise how much memory
> > > and CPU they will need if they want to set max_connections = 3000?
> >
> > Pgbouncer would be the best solution. CPU: number of concurrent
> > connections. RAM: shared_buffers + max_connections * work_mem +
> > maintenance_work_mem + operating system + ...
>
> Right.  And then hope and pray that a) the database doesn't get overloaded
> and b) you don't hit any of the database-internal bottlenecks caused by
> that many connections.
>
> I also have the feeling that the Linux kernel's memory accounting somehow
> lags.  I have seen cases where every snapshot of "pg_stat_activity" I took
> showed only a few active connections (different ones each time), yet the
> amount of allocated memory exceeded what the currently active sessions
> could consume.  I may have made a mistake, and I have no reproducer, but
> I would be curious to know if there is an explanation for that.
> (I am aware that "top" shows shared buffers multiple times.)
>
> Yours,
> Laurenz Albe
>


Re: Selecting RAM and CPU based on max_connections

2022-05-20 Thread Laurenz Albe
On Fri, 2022-05-20 at 12:15 +0200, Andreas Kretschmer wrote:
> On 20 May 2022 10:27:50 CEST, aditya desai  wrote:
> > One of our applications needs 3000 max_connections to the database.
> > A connection pooler like pgbouncer or pgpool is not yet certified within
> > the organization, so they are looking at setting up high-specification
> > hardware with enough CPU and memory. Can someone advise how much memory
> > and CPU they will need if they want to set max_connections = 3000?
> 
> Pgbouncer would be the best solution. CPU: number of concurrent connections.
> RAM: shared_buffers + max_connections * work_mem + maintenance_work_mem +
> operating system + ...

Right.  And then hope and pray that a) the database doesn't get overloaded
and b) you don't hit any of the database-internal bottlenecks caused by that
many connections.

I also have the feeling that the Linux kernel's memory accounting somehow lags.
I have seen cases where every snapshot of "pg_stat_activity" I took showed
only a few active connections (different ones each time), yet the amount of
allocated memory exceeded what the currently active sessions could consume.
I may have made a mistake, and I have no reproducer, but I would be curious
to know if there is an explanation for that.
(I am aware that "top" shows shared buffers multiple times.)
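
A snapshot of that kind can be as simple as something along these lines,
taken every few seconds and compared with the memory the OS reports for
the backend processes:

    SELECT now(), pid, usename, state, wait_event_type, query
    FROM pg_stat_activity
    WHERE state <> 'idle'
    ORDER BY pid;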

Yours,
Laurenz Albe




Re: Selecting RAM and CPU based on max_connections

2022-05-20 Thread Andreas Kretschmer
On 20 May 2022 10:27:50 CEST, aditya desai  wrote:
>Hi,
>One of our applications needs 3000 max_connections to the database.
>A connection pooler like pgbouncer or pgpool is not yet certified within
>the organization, so they are looking at setting up high-specification
>hardware with enough CPU and memory. Can someone advise how much memory
>and CPU they will need if they want to set max_connections = 3000?
>
>Regards,
>Aditya.

Pgbouncer would be the best solution. CPU: number of concurrent connections. 
RAM: shared_buffers + max_connections * work_mem + maintenance_work_mem +
operating system + ...
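
To put rough numbers on that formula (purely illustrative; the real values
depend on your settings and workload, and this assumes the default work_mem
of 4 MB):

    shared_buffers                          8 GB
    max_connections * work_mem     3000 * 4 MB ~= 12 GB  (worst case, and a
                                            complex query can use several work_mem)
    maintenance_work_mem                    1 GB
    operating system, monitoring, ...       a few GB
    ------------------------------------------------
    => well over 20 GB before the OS page cache gets anything

With a pooler like pgbouncer in front, the same workload typically needs only
a fraction of that, because the number of server connections stays small.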
 


-- 
2ndQuadrant - The PostgreSQL Support Company




Selecting RAM and CPU based on max_connections

2022-05-20 Thread aditya desai
Hi,
One of our applications needs 3000 max_connections to the database.
A connection pooler like pgbouncer or pgpool is not yet certified within
the organization, so they are looking at setting up high-specification
hardware with enough CPU and memory. Can someone advise how much memory
and CPU they will need if they want to set max_connections = 3000?

Regards,
Aditya.