On 10/22/2013 12:59 PM, Stephen Frost wrote:
> Andy,
>
> * andy (a...@squeakycode.net) wrote:
>> My website is about to get a little more popular.  I'm trying to add
>> in some measurements to determine an upper limit of how many
>> concurrent database connections I'm currently using.
>
> PG is really *much* happier if you have only one backend per CPU in your
> system.  The way to get there is by using a connection pooler like
> pgbouncer and configuring it based on how many CPUs you have.
>
> pgbouncer can also provide stats for you.
>
> 	Thanks,
>
> 		Stephen


Ahh, bummer, man.  PgBouncer doesn't work so well when you have lots of
databases.  I have about 90 databases, and the website could connect to any one
of them on any request.  (They are all about equally likely to be hit.)

In my pgbouncer.ini I set

[databases]
* =

[pgbouncer]
max_client_conn = 200
default_pool_size = 20

but that made each database its own pool, so 90 * 20 = 1800 possible
connections (though capped at max_client_conn = 200, I assume).  Postgres has
only a max_connections of 40, so I started getting "FATAL: sorry, too many
clients already" errors.
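Spelling the arithmetic out (how pgbouncer caps things here is my assumption,
not something I've verified in its docs):

```python
# Back-of-the-envelope connection math for my setup.
# Assumption: pgbouncer gives each database its own pool of
# default_pool_size, and max_client_conn caps total client connections.
num_databases = 90
default_pool_size = 20
max_client_conn = 200
pg_max_connections = 40

worst_case_pools = num_databases * default_pool_size  # 1800 potential server conns
effective_cap = min(worst_case_pools, max_client_conn)

print(worst_case_pools)                     # 1800
print(effective_cap)                        # 200
print(effective_cap > pg_max_connections)   # True -- still over Postgres's limit
```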

I set max_client_conn = 40
and default_pool_size = 3
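So the relevant stanzas of my pgbouncer.ini now read (rest of the file
omitted):

```ini
[databases]
* =

[pgbouncer]
max_client_conn = 40
default_pool_size = 3
```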

so it should stop erroring out, but is this type of setup even going to help
me?  If pgbouncer has 40 connections open to 40 different databases, and then a
request comes in for a db it doesn't have, it'll have to drop one and build a
new connection, yes?  Won't that be slower than just connecting directly to
postgres in the first place?

I need a Most-Recently-Used pool, not a per-database pool.  Is there a way to 
do this with pgbouncer?  (With a large number of databases)
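To be concrete, here is roughly the behaviour I'm after, sketched in Python
(purely illustrative: the class and connect_fn are made up, and this is not
something pgbouncer does today as far as I can tell):

```python
from collections import OrderedDict

class MRUConnectionPool:
    """Sketch of the pooling I want: one global connection cap shared
    across all databases, evicting the least-recently-used database's
    connection when the cap is hit.  connect_fn stands in for whatever
    actually opens a database connection."""

    def __init__(self, max_connections, connect_fn):
        self.max_connections = max_connections
        self.connect_fn = connect_fn
        self.pool = OrderedDict()  # dbname -> connection, oldest first

    def get(self, dbname):
        if dbname in self.pool:
            self.pool.move_to_end(dbname)  # mark as most recently used
            return self.pool[dbname]
        if len(self.pool) >= self.max_connections:
            # Cap reached: close the connection that was used least recently.
            _, victim = self.pool.popitem(last=False)
            victim.close()
        conn = self.connect_fn(dbname)
        self.pool[dbname] = conn
        return conn
```

With that, 90 databases could share 40 server connections, and only a request
for a "cold" database pays the reconnect cost.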

-Andy


--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
