Kobus Wolvaardt wrote:
Hi,

We have software deployed on our network that needs Postgres, and a server that hosts the database. Everything worked fine until we crossed about 200 users. The application, written in Delphi, opens a connection right at startup and keeps it alive for the duration of the app. The Postgres server runs on Windows Server 2008 with a quad-core CPU and 4 GB of RAM.

The problem is that after roughly 200 connections the server runs out of memory, even though most of these connections are idle: each one is only used about every 20 minutes to capture a transaction.

It looks like every idle connection uses about 10 MB of RAM, which seems high, but I cannot find a config option to limit it.

I tried pgbouncer for connection pooling, but for each connection to pgbouncer one connection is made to the server, which results in exactly the same number of connections. If I run it in transaction pooling mode it works for simple queries, but our programmer says something gets lost (views that were set up, or something like that).

Views are stored in the database and are not connection-specific. Things that are connection-specific are settings like the schema search path, language and encoding options, and so forth. If you set up the database properly, these shouldn't be an issue.
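For example, connection-specific settings can be pinned at the database or role level so every new connection picks them up automatically, which makes transaction pooling safe. A sketch, with placeholder names ("appdb", "appuser", "app_schema" are not from your setup):

```sql
-- Illustrative only: appdb, appuser, and app_schema are placeholders.
-- Settings applied this way take effect for every new connection,
-- so nothing has to be SET per session by the client.
ALTER DATABASE appdb SET search_path = public, app_schema;
ALTER ROLE appuser SET client_encoding = 'UTF8';
```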

I would seriously plan on rewriting the app to use pg_pool or similar: fetch a connection from a much smaller pool of actual database connections, use it, then release it back to the pool. For this to work, all your database connections need the same persistent settings.
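The acquire/use/release pattern described above can be sketched as follows. This is a minimal illustration in Python, not your Delphi app; the connection factory is a stand-in for whatever actually opens a PostgreSQL connection:

```python
import queue

class ConnectionPool:
    """Minimal pool sketch: a fixed set of connections shared by many clients.

    make_conn is a placeholder factory; in a real app it would open an
    actual PostgreSQL connection."""

    def __init__(self, make_conn, size=10):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(make_conn())

    def acquire(self, timeout=30):
        # Blocks until a connection is free, so 200 clients that each
        # touch the database every 20 minutes can share a handful of
        # real server backends.
        return self._pool.get(timeout=timeout)

    def release(self, conn):
        # Return the connection for the next caller instead of closing it.
        self._pool.put(conn)

# Usage: acquire -> run the transaction -> release.
pool = ConnectionPool(lambda: object(), size=5)
conn = pool.acquire()
# ... run the transaction on conn ...
pool.release(conn)
```

Only `size` server backends ever exist, so the per-connection memory cost is bounded regardless of how many application instances are running.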



--
Sent via pgsql-general mailing list (pgsql-general@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
