----------------------------------------------------------------
BEFORE YOU POST, search the faq at <http://java.apache.org/faq/>
WHEN YOU POST, include all relevant version numbers, log files, and
configuration files. Don't make us guess your problem!!!
----------------------------------------------------------------

Sockets also use file descriptors. If there are many simultaneous
connections, you may run out of descriptors and be unable to open a new
connection. Another scenario would be slow clients -- if each client takes
a long time to read the data you send, sockets stay open longer and more
are in use simultaneously than you might normally expect. Finally, after a
socket is closed, there is a timeout that has to elapse before the
descriptor is fully released. And of course, a buggy application that fails
to close sockets, or that otherwise behaves badly, can cause the same
thing.

An example: where I work we have a website based on a "name brand" app
server, running on Solaris with 4096 descriptors per process. Every few
days we get "out of file descriptor" errors because the webserver proxy
erroneously sends bad requests to the app server at a rate of 40+/sec. The
sockets build up and cannot time out fast enough.

The next time you encounter this problem, I suggest you run netstat -a on
the JServ box to see how many sockets there are and in what states.
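The netstat suggestion above can be sketched as a quick per-state tally (a hypothetical helper, assuming a Unix box with netstat and awk; `tally_states` is a name introduced here, not anything from JServ):

```shell
# Summarize TCP socket states from `netstat -an` output.
# Many CLOSE_WAIT sockets suggest the app never closes them;
# many TIME_WAIT sockets suggest churn plus the post-close timeout.
tally_states() {
  awk '/^tcp/ {count[$NF]++} END {for (s in count) print count[s], s}' | sort -rn
}

# On the JServ box you would run:  netstat -an | tally_states
# Demo on canned netstat-style lines:
printf 'tcp 0 0 10.0.0.1:8007 10.0.0.2:1034 TIME_WAIT\ntcp 0 0 10.0.0.1:8007 10.0.0.2:1035 TIME_WAIT\ntcp 0 0 10.0.0.1:8007 10.0.0.2:1036 ESTABLISHED\n' | tally_states
```

If one state dominates the output, that usually narrows the cause quickly.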
- Fernando

From: Ben Ricker <[EMAIL PROTECTED]>
Sent by: <[EMAIL PROTECTED]-dogs.com>
Date: 01/12/2001 11:14 AM
Please respond to: Java Apache Users
To: Java Apache Users <[EMAIL PROTECTED]>
Subject: File-Max Limit reached

I am banging my head against the wall for the answer to a problem. This may
or may not be related to JServ/JVM issues, but I thought I would ask the
masses if they might have experienced the same problem.

We have a servlet-based web app running on Red Hat 6.2 with Sun JVM 1.2.2
and JServ 1.1.2. I got a call that our web app was unreachable. I could not
get to the box remotely, so I went to the console. There I got an error to
the effect that my file-max open file limit had been reached. I could not
do any diagnosis on the box because I could not even open a shell, so I had
to hard reboot it, and everything came back up fine.

I have no users on this box, so user mischief is not a possibility. I only
run telnet and FTP for a few developers and ssh for admins. There are no
other services being run except for java, httpd (Apache), and system
things like crond, syslogd, etc.

Can anyone think of a scenario where JServ or the JVM may cause a massive
opening of files? Massive logging, perhaps? Garbage collection gone
horribly wrong?
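A quick way to watch for this before the box wedges again is to compare the kernel's allocated descriptors against the file-max limit (a sketch for Linux, assuming the standard /proc files; paths and the cron idea are suggestions, not part of the original setup):

```shell
# Kernel-wide descriptor limit and current usage on Linux.
echo "kernel-wide limit: $(cat /proc/sys/fs/file-max)"
# file-nr reports: allocated, free, max
echo "file-nr (allocated free max): $(cat /proc/sys/fs/file-nr)"

# Top 5 processes by open descriptor count (reads /proc/<pid>/fd;
# run as root to see every process).
for p in /proc/[0-9]*; do
  n=$(ls "$p/fd" 2>/dev/null | wc -l)
  [ "$n" -gt 0 ] && echo "$n ${p#/proc/}"
done | sort -rn | head -5
```

Running something like this from cron and logging the output would show whether the java process is the one leaking descriptors.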
I am grasping at straws here. Sorry if this is a little off-topic. If you
think it is not JServ that caused the problem, respond to me privately.

Thanks!

Ben Ricker
Senior Systems Administrator
US-Rx, Inc.

--
--------------------------------------------------------------
Please read the FAQ! <http://java.apache.org/faq/>
To subscribe: [EMAIL PROTECTED]
To unsubscribe: [EMAIL PROTECTED]
Search Archives:
<http://www.mail-archive.com/java-apache-users%40list.working-dogs.com/>
Problems?: [EMAIL PROTECTED]