My implementation now uses a pool size of 120 (the server is configured to
allow 128 concurrent connections), and my bottleneck is now the database:
the connection pool is quickly exhausted in my test. What's a good strategy
to improve performance? I am thinking of queueing the database requests and
only executing them once a certain number (say, 50 SQL statements) has
accumulated.
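Something along these lines is what I have in mind, as a rough sketch only
(StatementQueue and the batch size are my own illustrative names, nothing
from the libpqxx API):

#include <pqxx/pqxx>

#include <cstddef>
#include <string>
#include <utility>
#include <vector>

// Sketch: buffer statements and flush them in one transaction once the
// batch threshold is reached, so one connection carries 50 statements
// instead of 50 connections carrying one each.
class StatementQueue
{
public:
  explicit StatementQueue(pqxx::connection &conn, std::size_t batch_size = 50) :
    m_conn(conn), m_batch_size(batch_size) {}

  void enqueue(std::string sql)
  {
    m_pending.push_back(std::move(sql));
    if (m_pending.size() >= m_batch_size) flush();
  }

  // Execute all buffered statements in a single transaction.
  void flush()
  {
    if (m_pending.empty()) return;
    pqxx::work txn(m_conn);
    for (std::string const &sql : m_pending) txn.exec(sql);
    txn.commit();
    m_pending.clear();
  }

private:
  pqxx::connection &m_conn;
  std::size_t m_batch_size;
  std::vector<std::string> m_pending;
};

The obvious trade-off is that a failure anywhere in the batch rolls back
all 50 statements, so this only fits statements that are safe to retry
together.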

In my case the Postgres server simply stops responding after a while, and
the allocated connections are never released from the client... But CPU
load is never really very high... Maybe there is something else wrong.
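To rule out a leak on my side first, I am considering forcing every
checkout through an RAII handle, so a connection goes back to the pool on
every path, including exceptions. Again only a sketch; ConnectionPool and
PooledConnection are my own illustrative names, not libpqxx types:

#include <pqxx/pqxx>

#include <memory>
#include <mutex>
#include <utility>
#include <vector>

// Minimal home-grown pool, just enough to show the release path.
class ConnectionPool
{
public:
  std::unique_ptr<pqxx::connection> acquire()
  {
    std::lock_guard<std::mutex> lock(m_mutex);
    if (m_idle.empty())
      // Connects using the usual libpq defaults/environment.
      return std::make_unique<pqxx::connection>();
    std::unique_ptr<pqxx::connection> conn = std::move(m_idle.back());
    m_idle.pop_back();
    return conn;
  }

  void release(std::unique_ptr<pqxx::connection> conn)
  {
    std::lock_guard<std::mutex> lock(m_mutex);
    m_idle.push_back(std::move(conn));
  }

private:
  std::mutex m_mutex;
  std::vector<std::unique_ptr<pqxx::connection>> m_idle;
};

// Scoped handle: release happens in the destructor, on every exit path.
// If any code path acquires a connection without a guard like this, an
// early return or a thrown exception leaks it, which would match the
// "connection never released" symptom I am seeing.
class PooledConnection
{
public:
  explicit PooledConnection(ConnectionPool &pool) :
    m_pool(pool), m_conn(m_pool.acquire()) {}
  ~PooledConnection() { m_pool.release(std::move(m_conn)); }

  pqxx::connection &get() { return *m_conn; }

private:
  ConnectionPool &m_pool;
  std::unique_ptr<pqxx::connection> m_conn;
};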

Fei
Jeroen T. Vermeulen wrote:
> On Thu, July 12, 2007 04:26, Bart Samwel wrote:
>> Fei Liu wrote:
>>> Thanks Jeroen, I am trying to get a connection_pool model working, but
>>> I am getting this error:
>>> what():  FATAL:  sorry, too many clients already
>>>
>>> What's the upper limit of concurrent connections pqxx supports? What
>>> freedom do I have to adjust it?
>>
>> There's no upper limit, as far as I know. This seems to be generated by
>> the Postgres server; you might want to check the server configuration!
>
> Correct.  There is no limit in libpqxx itself.  This error means there's
> probably a bug in the code that causes it to create a huge number of
> connections, or that your connection pool is too large.
>
> Jeroen
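For what it's worth, the server-side cap can be read back from a pqxx
program; SHOW max_connections is standard Postgres, the rest is just a
sketch:

#include <iostream>
#include <string>

#include <pqxx/pqxx>

int main()
{
  pqxx::connection conn;  // connects using the usual libpq defaults
  pqxx::work txn(conn);
  // Ask the server how many concurrent clients it allows; the pool
  // size has to stay safely below this number.
  pqxx::row row = txn.exec1("SHOW max_connections");
  std::cout << "max_connections = " << row[0].as<std::string>() << '\n';
}

If the pool really is capped at 120 and connections still run out, that
points back at a leak rather than at the pool size.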

