
http://issues.apache.org/bugzilla/show_bug.cgi?id=25757

Too Many Files Open Error

------- Additional Comments From [EMAIL PROTECTED]  2004-07-14 17:35 -------
Kevin,

The hard file descriptor limit you set is a per-process limit (the rlim_fd_max 
setting in /etc/system), right?
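(For reference, a per-process hard limit set in /etc/system would look roughly 
like the lines below; the numbers are only illustrative, not a recommendation:)

    * example /etc/system entries -- values are illustrative only
    * hard per-process file descriptor limit
    set rlim_fd_max=4096
    * soft (default) per-process file descriptor limit
    set rlim_fd_cur=1024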

Your lsof output shows 9000 total descriptors open system-wide, right (as 
opposed to the file descriptors held by any one httpd process)?

I don't see how you would be hitting per-process limits then.

What about system-wide or per-user limits?  (I don't know whether Solaris 
imposes those.)

As for determining whether there is an Apache bug that results in hitting some 
sort of file descriptor limit:

Use your lsof output to find out whether any particular httpd process has an 
unreasonable number of open file descriptors.  Can you find such a process and 
summarize what sorts of files are open (a rough script for doing that is 
below)?  Perhaps there are numerous open sockets connected to an LDAP server?
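In case it helps, here is a rough Python sketch of the kind of per-process 
summary I mean.  It assumes you saved the raw output with "lsof > lsof.out", 
that the columns follow the usual lsof header (COMMAND, PID, USER, FD, 
TYPE, ...), and that the command name is "httpd"; adjust any of that for your 
system.

    #!/usr/bin/env python
    # Summarize open descriptors per httpd process from saved lsof output.
    # Column positions assume the standard header: COMMAND PID USER FD TYPE ...
    import sys
    from collections import defaultdict

    path = sys.argv[1] if len(sys.argv) > 1 else "lsof.out"
    counts = defaultdict(lambda: defaultdict(int))   # pid -> TYPE -> count

    with open(path) as f:
        f.readline()                                 # skip the header line
        for line in f:
            fields = line.split()
            if len(fields) < 5 or fields[0] != "httpd":
                continue
            pid, fd_type = fields[1], fields[4]      # PID and TYPE columns
            counts[pid][fd_type] += 1

    # Print the processes holding the most descriptors first.
    for pid, types in sorted(counts.items(),
                             key=lambda item: -sum(item[1].values())):
        total = sum(types.values())
        detail = ", ".join("%s=%d" % (t, n) for t, n in sorted(types.items()))
        print("httpd pid %s: %d open (%s)" % (pid, total, detail))

If one child stands out with a large pile of IPv4/IPv6 entries, the NAME 
column in the raw lsof output will show the remote endpoint, which should make 
leaked connections to the LDAP server (port 389) easy to spot.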
