[EMAIL PROTECTED] ("Bruno Almeida do Lago") wrote:
> Is there a real limit for max_connections? Here we have an Oracle server
> with up to 1200 simultaneous connections on it!
>
> "max_connections: exactly like previous versions, this needs to be set to
> the actual number of simultaneous connections you expect to need. High
> settings will require more shared memory (shared_buffers). As the
> per-connection overhead, both from PostgreSQL and the host OS, can be
> quite high, it's important to use connection pooling if you need to
> service a large number of users. For example, 150 active connections on a
> medium-end 32-bit Linux server will consume significant system resources,
> and 600 is about the limit."
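The pooling idea the quoted text recommends can be sketched in a few lines: a small, fixed set of real backend connections is opened once, and many short transactions borrow and return them instead of each client holding its own. This is a toy illustration only, assuming nothing about any real pooler's API; make_conn and ToyPool are hypothetical stand-ins, not PostgreSQL or PgBouncer interfaces.

```python
import itertools
import queue

_conn_ids = itertools.count(1)

def make_conn():
    """Stand-in for an expensive real connect() call (hypothetical)."""
    return {"id": next(_conn_ids)}

class ToyPool:
    """A deliberately simplified pool: N connections shared by many clients."""
    def __init__(self, factory, size):
        self._free = queue.Queue()
        for _ in range(size):
            self._free.put(factory())   # open the fixed set up front

    def acquire(self):
        return self._free.get()         # blocks until a connection is returned

    def release(self, conn):
        self._free.put(conn)

pool = ToyPool(make_conn, size=4)

# 20 short "transactions" run against only 4 real backend connections.
used = set()
for _ in range(20):
    conn = pool.acquire()
    used.add(conn["id"])                # the small transaction would run here
    pool.release(conn)

print(len(used))   # only 4 distinct connections served all 20 clients
```

With this shape, the database only ever sees 4 backends, however many application-level users queue up behind acquire(), which is exactly why pooling keeps the per-connection overhead bounded.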
Right now, I have an Opteron box with:

a) a load average of about 0.1, possibly less ;-), and
b) 570 concurrent connections.

Having that many connections is something of a "fool's errand," as it really is ludicrously unnecessary, but I wouldn't be too afraid of having 1000 connections on that box, as long as they're being used for relatively small transactions. You can, of course, kill performance on any not-outrageously-large system if a few of those users are doing big queries...
--
wm(X,Y):-write(X),write('@'),write(Y). wm('cbbrowne','gmail.com').
http://cbbrowne.com/info/slony.html
I've had a perfectly wonderful evening. But this wasn't it. -- Groucho Marx

---------------------------(end of broadcast)---------------------------
TIP 9: the planner will ignore your desire to choose an index scan if your
       joining column's datatypes do not match
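As a side note on the numbers discussed in this thread: the current backend count can be checked directly and compared against the configured ceiling. This is a minimal query fragment using the standard pg_stat_activity view; no assumptions beyond stock PostgreSQL.

```sql
-- How many backends are connected right now.
SELECT count(*) FROM pg_stat_activity;

-- The configured ceiling, for comparison.
SHOW max_connections;
```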