Re: Too many open files error

2011-08-24 Thread Francis GALIEGUE
On Wed, Aug 24, 2011 at 17:21, Campbell, Lance la...@illinois.edu wrote:
 Tomcat 6.0.32
 Java 1.6.27
 Apache 2.0
 RedHat 6.x 64 bit
 /proc/sys/fs/file-max = 3233344

 We experienced an issue where we were getting the error too many open files
 in tomcat.  The server manager increased the maximum number of open files to
 the value above, but the error kept coming back even after rebooting the
 server.  Is there a maximum number of connections that Tomcat should stay
 within, given the above specs?

 The servlet in question that was being hit returned an XML document after 
 doing a series of database queries.


file-max is not what you want to modify. What you need to raise is the per-user limit: RLIMIT_NOFILE.

Look in /etc/security/limits.d. You'll need to restart Tomcat.

-- 
Francis Galiegue
ONE2TEAM
Ingénieur système
Mob : +33 (0) 683 877 875
Tel : +33 (0) 178 945 552
f...@one2team.com
40 avenue Raymond Poincaré
75116 Paris




RE: Too many open files error

2011-08-24 Thread Campbell, Lance
The file /etc/security/limits.d is empty.  What would be an example of 
something you would expect to see in there that would relate to changing the 
RLIMIT_NOFILE value? 

Thanks,  

From: Francis GALIEGUE [f...@one2team.com]
Sent: Wednesday, August 24, 2011 10:24 AM
To: Tomcat Users List
Subject: Re: Too many open files error

On Wed, Aug 24, 2011 at 17:21, Campbell, Lance la...@illinois.edu wrote:
 [...]


file-max is not what you want to modify. What you need to raise is the per-user limit: RLIMIT_NOFILE.

Look in /etc/security/limits.d. You'll need to restart Tomcat.




Re: Too many open files error

2011-08-24 Thread Francis GALIEGUE
On Wed, Aug 24, 2011 at 17:33, Campbell, Lance la...@illinois.edu wrote:
 The file /etc/security/limits.d is empty.  What would be an example of 
 something you would expect to see in there that would relate to changing the 
 RLIMIT_NOFILE value?


It's a directory, not a file. Create a file named tomcat in it (or
whatever name you want) and put these two lines in it:

tomcat soft nofile 16384
tomcat hard nofile 16384

(if the user running Tomcat is indeed called tomcat)

If you want to see the current limit, as root, run:

su tomcat -c 'ulimit -n'




Re: Too many open files error

2011-08-24 Thread Christopher Schultz

Lance,

On 8/24/2011 11:21 AM, Campbell, Lance wrote:
 Tomcat 6.0.32
 Java 1.6.27
 Apache 2.0
 RedHat 6.x 64 bit
 /proc/sys/fs/file-max = 3233344

 We experienced an issue where we were getting the error too many open
 files in tomcat.  The server manager increased the maximum number of
 open files to the value above, but the error kept coming back even
 after rebooting the server.  Is there a maximum number of connections
 that Tomcat should stay within, given the above specs?

 The servlet in question that was being hit returned an XML document
 after doing a series of database queries.

You may find that the problem isn't the number of on-disk files but
the number of file descriptors, which might actually have different
meanings on your system.

It's also possible that the JVM is giving you a spurious message about
too many files when the problem is really the number of /threads/ --
I've seen that in the past, too.

Can you post the exact stack trace that you got along with this error?
Also, how about the output of ulimit -a for the user that actually
runs Tomcat? Finally, what do your Connector elements look like in
conf/server.xml?
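
For example, something like this (a sketch; <tomcat-user> and <pid> are
placeholders for your setup):

  su - <tomcat-user> -c 'ulimit -a'    # limits of the user running Tomcat
  lsof -p <pid> | wc -l                # descriptors currently open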

-chris



RE: too many open files issue in tomcat

2011-06-27 Thread Guy Katz
I think you will get better help by providing the following:
- Which Tomcat version are you using?
- Which OS are you deploying on?
- What is your memory setting for Tomcat (if explicitly set)?
- What is your file descriptor configuration in the OS (if explicitly set)?
  (See the commands sketched below for how to check.)
- Does the problem arrive alongside an out-of-memory error?
- Are there any 'IO heavy' processes that run on your deployment machine
  along with Tomcat (DB, etc.)?
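
For the file descriptor question, something like this (a sketch; run as
the user that owns the Tomcat process) will show the basics:

  ulimit -Sn                     # soft per-process limit on open files
  ulimit -Hn                     # hard per-process limit
  cat /proc/sys/fs/file-max      # kernel-wide limit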

-Original Message-
From: dasari@wipro.com [mailto:dasari@wipro.com] 
Sent: Monday, June 27, 2011 12:54 PM
To: users@tomcat.apache.org
Subject: too many open files issue in tomcat

Hi,

 

I am facing a too many open files issue in Tomcat, and it is then not
able to process any further requests. Has somebody faced the same
problem, and what are the cause and solution for it?

This issue is creating a lot of problems on the production systems, so
if somebody has already solved this issue, please provide us the
solution.

 

Regards

Dayakar





RE: too many open files issue in tomcat

2011-06-27 Thread dasari.rao
Hi,

Tomcat version is 6.0.29
OS is RHEL5.5-1
-Xms256m -Xmx768m
No explicit file descriptor configuration; it's the OS default.
We have not observed an out-of-memory error, but have sometimes seen
socket problems.
Only Tomcat is running on the server, but it communicates with the DB on
another server.

Regards
Dayakar
-Original Message-
From: Guy Katz [mailto:gk...@allot.com]
Sent: Monday, June 27, 2011 3:38 PM
To: users@tomcat.apache.org
Subject: RE: too many open files issue in tomcat

[...]




Re: too many open files issue in tomcat

2011-06-27 Thread Mark Thomas
On 27/06/2011 11:21, dasari@wipro.com wrote:
 Hi,
 
 Tomcat version is 6.0.29
 OS is RHEL5.5-1
 -Xms256m -Xmx768m
 No explicit file descriptor configuration; it's the OS default.
 We have not observed an out-of-memory error, but have sometimes seen
 socket problems.
 Only Tomcat is running on the server, but it communicates with the DB on
 another server.

Search the archives / look in the FAQ (I think this is in there).

Mark

 







Re: too many open files issue in tomcat

2011-06-27 Thread Jason Viloria
On Mon, Jun 27, 2011 at 12:30 PM, Mark Thomas ma...@apache.org wrote:

 On 27/06/2011 11:21, dasari@wipro.com wrote:
  Hi,
 
  Tomcat version is 6.0.29
  OS is RHEL5.5-1
  -Xms256m -Xmx768m
  No explicit file descriptor configuration; it's the OS default.
  We have not observed an out-of-memory error, but have sometimes seen
  socket problems.
  Only Tomcat is running on the server, but it communicates with the DB on
  another server.

 Search the archives / look in the FAQ (I think this is in there).


Just talking from what I have experienced: most of the time I have
encountered this problem, it was because of a bottleneck somewhere in
the code. A Tomcat thread needs to communicate via SOAP and the remote
system takes too long to time out, so Tomcat threads sit around wasting
fds. Another common issue is DB bottlenecks, once again causing Tomcat
threads to hang around too long, so they don't get freed up fast enough
to further serve clients. You should also check how many connections
are coming in to your system, using any network monitoring tool you can
put on; it could really be just that you are reaching the fd limit and
hence must raise your ulimits. Hope this helps.
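
For example, a quick sketch to see whether half-closed sockets are piling
up (<pid> is a placeholder for the Tomcat process id):

  netstat -an | grep -c CLOSE_WAIT    # sockets stuck half-closed
  lsof -p <pid> | grep -c TCP         # TCP descriptors held by Tomcat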

/Jason


Re: Too many open files

2010-05-25 Thread André Warnier

pri...@samea.de wrote:

Hello,

I have a problem with my little CORBA servlet.
I always get this error:

java.net.SocketException: Too many open files
  sun.nio.ch.Net.socket0(Native Method)
  sun.nio.ch.Net.socket(Net.java:97)
  sun.nio.ch.SocketChannelImpl.<init>(SocketChannelImpl.java:84)
  sun.nio.ch.SelectorProviderImpl.openSocketChannel(SelectorProviderImpl.java:37)
  java.nio.channels.SocketChannel.open(SocketChannel.java:105)
  java.nio.channels.SocketChannel.open(SocketChannel.java:145)
  com.sun.corba.se.impl.transport.DefaultSocketFactoryImpl.createSocket(DefaultSocketFactoryImpl.java:60)
  com.sun.corba.se.impl.transport.SocketOrChannelConnectionImpl.<init>(SocketOrChannelConnectionImpl.java:188)
  com.sun.corba.se.impl.transport.SocketOrChannelConnectionImpl.<init>(SocketOrChannelConnectionImpl.java:218)
  com.sun.corba.se.impl.transport.SocketOrChannelContactInfoImpl.createConnection(SocketOrChannelContactInfoImpl.java:101)
  com.sun.corba.se.impl.protocol.CorbaClientRequestDispatcherImpl.beginRequest(CorbaClientRequestDispatcherImpl.java:152)
  com.sun.corba.se.impl.protocol.CorbaClientDelegateImpl.request(CorbaClientDelegateImpl.java:118)
  com.sun.corba.se.impl.protocol.CorbaClientDelegateImpl.is_a(CorbaClientDelegateImpl.java:211)
  org.omg.CORBA.portable.ObjectImpl._is_a(ObjectImpl.java:112)

I use: Tomcat 6.0.26 on Linux JVM: 1.6.0_20-b02

At first sight, it doesn't look as if this is a Tomcat issue.  It seems 
more of an issue within your servlet.

Is this servlet opening its own connection to something else?
If yes, then you probably forgot to close this connection when you are 
done with it, and connections accumulate until the OS tells your process 
that it has too many open sockets at the same time.


Doing a netstat -an would probably provide more information.

If you are under Unix/Linux, you can also try lsof, but only studying 
the options is already quite a challenge.
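
For example (a sketch; <pid> is a placeholder for the Tomcat process id):

  lsof -p <pid>            # everything the process has open
  lsof -i -a -p <pid>      # only its network sockets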





Re: Too many open files

2010-05-25 Thread Pid
On 25/05/2010 10:26, pri...@samea.de wrote:
 Hello,
 
  I have a problem with my little CORBA servlet.
  I always get this error:
 
 [...]
 
 Thanks for your help.

Please start a completely new email, rather than replying to an existing
one and editing the subject & body - which is called thread-hijacking.


p







RE: too many open files

2010-05-25 Thread privat
Hello,

I had a look at my servlet, but it closes the connection after doing a
request.

Furthermore, it crashes when opening the nameserver.

Thanks for your help.

br,

Markus


[...]



Re: too many open files

2010-05-25 Thread Pid
On 25/05/2010 11:38, pri...@samea.de wrote:
 Hello,
 
 I had a look at my servlet, but it closes the connection after
 doing a request.

Does it close the connection if the request throws an error?
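
If not, the usual fix is a finally block, roughly like this (a minimal
sketch; openConnection() and doRequest() are hypothetical stand-ins for
your CORBA calls):

  Socket socket = null;
  try {
      socket = openConnection();   // hypothetical helper
      doRequest(socket);           // may throw
  } finally {
      if (socket != null) {
          try { socket.close(); } catch (IOException ignored) {}
      }
  }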


p







RE: too many open files

2010-05-25 Thread Caldarale, Charles R
 From: Pid [mailto:p...@pidster.com]
 Subject: Re: too many open files
 
  I had a look at my servlet, but it closes the connection after
  doing a request.
 
 Does it close the connection if the request throws an error?

And as previously suggested, use netstat to find out if the excess files really 
are sockets.

 - Chuck


THIS COMMUNICATION MAY CONTAIN CONFIDENTIAL AND/OR OTHERWISE PROPRIETARY 
MATERIAL and is thus for use only by the intended recipient. If you received 
this in error, please contact the sender and delete the e-mail and its 
attachments from all computers.


-
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org



Re: Too many open files

2010-05-25 Thread Rainer Jung

On 25.05.2010 11:54, André Warnier wrote:

[...]


On Linux you can get the most important file descriptor info also by 
looking at the proc filesystem. If your process has process id (PID) 
XYZ, then do:

   ls -l /proc/XYZ/fd

It will list all open file descriptors, so you can find out why there 
are so many open ones (are they files, sockets, ...). If they are 
sockets, you can list the association between the sockets and the PID 
(=XYZ) using:

   netstat -anp

Finally, if you think you will need only a few more descriptors, have a 
look at the ulimit command (man ulimit). There are hard and soft 
limits. You can look at the active limits with ulimit -a and 
ulimit -H -a. The one you are interested in is open files. You might 
be able to set another limit with ulimit -n.
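
For example, to count how many of those descriptors are sockets stuck in
CLOSE_WAIT (a sketch; XYZ is the PID as above):

   ls -l /proc/XYZ/fd | grep -c socket
   netstat -anp | grep XYZ/ | grep -c CLOSE_WAIT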


Regards,

Rainer




Re: Too many open files

2008-10-10 Thread Christopher Schultz

Mohit,

Mohit Anchlia wrote:
 So I tried all the options. I also changed the code to use connection
 pooling with only 2 connections, but still there are a bunch of
 CLOSE_WAITs. As soon as I stop Tomcat, all of them go away. I am not
 able to figure out why there are so many CLOSE_WAITs hanging around
 when I just have 2 connections in my pool.

These are mostly HTTP connections to localhost, right? Maybe you are
using connection timeout options that are too long for your quick
transactions. CLOSE_WAIT is a normal TCP state, but if these pile up on
top of each other because of long (minutes?) timeouts, then you can
easily run out of file handles (socket ~= file handle, which is why you
are getting the too many open files error).

Consider setting some of these timeout options on HttpClient (if such
options exist) or researching the defaults for these options. Also, make
sure you are cleaning up after your connections appropriately (properly
catching IOExceptions, closing connections in finally blocks, etc.). If
connections are closing unexpectedly, they may be sitting in CLOSE_WAIT
longer than necessary.
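
With Commons HttpClient 3.x, for instance, the cleanup idiom looks
roughly like this (a sketch; the URL is a placeholder):

  HttpClient client = new HttpClient();
  GetMethod get = new GetMethod("http://localhost:8080/yourapp");
  try {
      client.executeMethod(get);            // may throw IOException
      byte[] body = get.getResponseBody();  // consume the response fully
  } finally {
      get.releaseConnection();              // always return the connection
  }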

Hope that helps,
-chris



Re: Too many open files

2008-10-10 Thread Christopher Schultz

Johnny,

Johnny Kewl wrote:
 PS: I see you have an Apache in front...
 Try this... set up the 8080 connector if you haven't already got it
 working in TC... and go into TC direct... then check file handles...
 If no problem, you know it's Apache or the JK connector...
 ... will at least point you in the right direction...

Oh, I hadn't even considered that. Are you using Apache httpd +
mod_proxy_http? If so, then the connections might be the ones from
Apache httpd to Tomcat, as Johnny suggests. In that case, you'll want to
inspect your httpd configuration for socket timeout configuration options.

Also, you might be able to remove Apache httpd from the mix entirely.
Are you sure you need to front Tomcat with Apache?

-chris




Re: Too many open files

2008-10-10 Thread Mohit Anchlia
I am using Apache in front of Tomcat (for load balancing).

On Fri, Oct 10, 2008 at 12:23 PM, Christopher Schultz
[EMAIL PROTECTED] wrote:
 [...]






Re: Too many open files

2008-10-08 Thread Konstantin Kolinko
2008/10/8 Mohit Anchlia [EMAIL PROTECTED]:
 I can see you can't wait to hear the debate. Anyhow, I am using
 HttpClient from apache commons and I do have .getReleaseConnection().


See comment #22 here:
https://issues.apache.org/bugzilla/show_bug.cgi?id=28727#c22
and the message thread that it refers to, [1]:

[1] http://www.mail-archive.com/[EMAIL PROTECTED]/msg04338.html

I do not know if that applies to your case. That message thread ([1]) is
from December 2003, so I do not know whether it is still applicable.

Just my 0.02 EUR.

Best regards,
Konstantin Kolinko




Re: Too many open files

2008-10-08 Thread Mohit Anchlia
So I tried all the options. I also changed the code to use connection
pooling with only 2 connections, but still there are a bunch of
CLOSE_WAITs. As soon as I stop Tomcat, all of them go away. I am not
able to figure out why there are so many CLOSE_WAITs hanging around
when I just have 2 connections in my pool.

On Wed, Oct 8, 2008 at 6:14 AM, Konstantin Kolinko
[EMAIL PROTECTED] wrote:
 [...]






Re: Too many open files

2008-10-08 Thread Serge Fonville
I'm fairly new to Tomcat, but I still hope I am able to help a little.
On the one hand, any connection object should be nulled so the garbage
collector can pick it up.
On the other hand, in server.xml you can define an executor and refer to
it in the connector; that way you can specify the idle time and the
maximum number of threads you want.
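
Roughly like this in conf/server.xml (a sketch; the names and numbers are
just examples):

  <Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
            maxThreads="200" minSpareThreads="4" maxIdleTime="60000"/>

  <Connector port="8080" protocol="HTTP/1.1"
             executor="tomcatThreadPool"
             connectionTimeout="20000"/>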

Hope this helps...

Serge

On Wed, Oct 8, 2008 at 11:24 PM, Mohit Anchlia [EMAIL PROTECTED] wrote:

 [...]




Re: Too many open files

2008-10-08 Thread Mohit Anchlia
I don't know how that will help in bringing down the CLOSE_WAITs from
Tomcat to Apache. I'll look at it, though.

On Wed, Oct 8, 2008 at 3:07 PM, Serge Fonville [EMAIL PROTECTED] wrote:
 [...]







Re: Too many open files

2008-10-07 Thread Johnny Kewl


- Original Message - 
From: Mohit Anchlia [EMAIL PROTECTED]

To: Tomcat Users List users@tomcat.apache.org
Sent: Tuesday, October 07, 2008 11:44 PM
Subject: Too many open files



Tomcat throws too many open files and when I do lsof I get a bunch of:

java14130 root  935u  IPv4 30842592   TCP localhost:41971->localhost:http (CLOSE_WAIT)
java14130 root  937u  IPv4 30841213   TCP efeitws3.ptctax.intuit.com:41161->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  938u  IPv4 30841214   TCP localhost:41162->localhost:http (CLOSE_WAIT)
java14130 root  939u  IPv4 30841220   TCP localhost:41165->localhost:http (CLOSE_WAIT)
java14130 root  940u  IPv4 30842516   TCP localhost:41927->localhost:http (CLOSE_WAIT)
java14130 root  941u  IPv4 30841226   TCP localhost:41168->localhost:http (CLOSE_WAIT)
java14130 root  943u  IPv4 30841899   TCP efeitws3.ptctax.intuit.com:41566->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  944u  IPv4 30841694   TCP efeitws3.ptctax.intuit.com:41453->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  945u  IPv4 30841695   TCP localhost:41454->localhost:http (CLOSE_WAIT)
java14130 root  946u  IPv4 30841900   TCP localhost:41567->localhost:http (CLOSE_WAIT)
java14130 root  948u  IPv4 30842415   TCP efeitws3.ptctax.intuit.com:41864->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  949u  IPv4 30842416   TCP localhost:41865->localhost:http (CLOSE_WAIT)
java14130 root  950u  IPv4 30842419   TCP localhost:41867->localhost:http (CLOSE_WAIT)
java14130 root  952u  IPv4 30850596   TCP localhost:42484->localhost:http (CLOSE_WAIT)
java14130 root  953u  IPv4 30842760   TCP localhost:42058->localhost:http (CLOSE_WAIT)
java14130 root  954u  IPv4 30842596   TCP localhost:41974->localhost:http (CLOSE_WAIT)
java14130 root  956u  IPv4 30842093   TCP localhost:41676->localhost:http (CLOSE_WAIT)
java14130 root  957u  IPv4 30842195   TCP efeitws3.ptctax.intuit.com:41737->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  958u  IPv4 30841730   TCP localhost:41467->localhost:http (CLOSE_WAIT)
java14130 root  959u  IPv4 30841737   TCP localhost:41472->localhost:http (CLOSE_WAIT)
java14130 root  960u  IPv4 30842196   TCP localhost:41738->localhost:http (CLOSE_WAIT)
java14130 root  961u  IPv4 30842528   TCP localhost:41933->localhost:http (CLOSE_WAIT)
java14130 root  962u  IPv4 30842363   TCP localhost:41836->localhost:http (CLOSE_WAIT)
java14130 root  964u  IPv4 30842365   TCP efeitws3.ptctax.intuit.com:41837->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  965u  IPv4 30842366   TCP localhost:41838->localhost:http (CLOSE_WAIT)
java14130 root  966u  IPv4 30842367   TCP localhost:41839->localhost:http (CLOSE_WAIT)
java14130 root  967u  IPv4 30842371   TCP localhost:41841->localhost:http (CLOSE_WAIT)
java14130 root  968u  IPv4 30842465   TCP localhost:41895->localhost:http (CLOSE_WAIT)
java14130 root  969u  IPv4 30848501   TCP localhost:42415->localhost:http (CLOSE_WAIT)
java14130 root  970u  IPv4 30842533   TCP localhost:41936->localhost:http (CLOSE_WAIT)
java14130 root  971u  IPv4 30842468   TCP localhost:41898->localhost:http (CLOSE_WAIT)
java14130 root  972u  IPv4 30842534   TCP localhost:41937->localhost:http (CLOSE_WAIT)
java14130 root  973u  IPv4 30842765   TCP localhost:42062->localhost:http (CLOSE_WAIT)
java14130 root  974u  IPv4 30842472   TCP localhost:41901->localhost:http (CLOSE_WAIT)
java14130 root  975u  IPv4 30842122   TCP localhost:41694->localhost:http (CLOSE_WAIT)
java14130 root  976u  IPv4 30842123   TCP localhost:41695->localhost:http (CLOSE_WAIT)
java14130 root  977u  IPv4 30843217   TCP localhost:42188->localhost:http (CLOSE_WAIT)
java14130 root  978u  IPv4 30842125   TCP localhost:41696->localhost:http (CLOSE_WAIT)
java14130 root  979u  IPv4 30842126   TCP efeitws3.ptctax.intuit.com:41697->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  981u  IPv4 30842128   TCP efeitws3.ptctax.intuit.com:41698->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  982u  IPv4 30842129   TCP localhost:41699->localhost:http (CLOSE_WAIT)
java14130 root  983u  IPv4 30888558   TCP localhost:43218->localhost:http (CLOSE_WAIT)
java14130 root  984u  IPv4 30842617   TCP localhost:41986->localhost:http (CLOSE_WAIT)
java14130 root  985u  IPv4 30842618   TCP efeitws3.ptctax.intuit.com:41987->10.10.81.94:webcache (CLOSE_WAIT)
java14130 root  986u  IPv4 30844067 

Re: Too many open files

2008-10-07 Thread Mohit Anchlia
I can see you can't wait to hear the debate. Anyhow, I am using
HttpClient from apache commons and I do have .getReleaseConnection().

On Tue, Oct 7, 2008 at 4:56 PM, Johnny Kewl [EMAIL PROTECTED] wrote:

 [...]

Re: Too many open files

2008-10-07 Thread Johnny Kewl


- Original Message - 
From: Johnny Kewl [EMAIL PROTECTED]

To: Tomcat Users List users@tomcat.apache.org
Sent: Wednesday, October 08, 2008 1:56 AM
Subject: Re: Too many open files




[...]

Re: Too many open files

2008-10-07 Thread Johnny Kewl


- Original Message - 
From: Mohit Anchlia [EMAIL PROTECTED]

To: Tomcat Users List users@tomcat.apache.org
Sent: Wednesday, October 08, 2008 2:11 AM
Subject: Re: Too many open files



I can see you can't wait to hear the debate. Anyhow, I am using
HttpClient from apache commons and I do have .getReleaseConnection().


The brain surgeons are arriving... 


... don't know if that's the cause, but it's possible.
Set the instance to null...
HttpClient = null;

when you're done with it... that may do the trick...

Have fun...

---
HARBOR : http://www.kewlstuff.co.za/index.htm
The most powerful application server on earth.
The only real POJO Application Server.
See it in Action : http://www.kewlstuff.co.za/cd_tut_swf/whatisejb1.htm
---
If you cant pay in gold... get lost...



For additional commands, e-mail: [EMAIL PROTECTED]




Re: Too many open files exception under heavy load - need help!

2008-01-25 Thread Rainer Traut

Tobias Schulz-Hess wrote:

 For Linux, this can be done dynamically by launching (from the OS
 prompt):

  echo 16384 > /proc/sys/fs/file-max

 When I do
 ~# cat /proc/sys/fs/file-max
 203065


This setting is a kernel limit.


This tells me that (at least this specific setting) is already
sufficient...


You most likely hit shell limits.

What user runs your tomcat server?

When you have found out, go to
/etc/security/limits.conf
and adjust parameters like this (or according to your needs):

tomcat   soft  nofile  9
tomcat   hard  nofile  9

tomcat   soft  nproc   8192
tomcat   hard  nproc   8192


You can check these limits after relogin with your tomcat user with 
'ulimit -a'.


Rainer

For additional commands, e-mail: [EMAIL PROTECTED]



Re: Too many open files exception under heavy load - need help!

2008-01-25 Thread Tobias Schulz-Hess
Hi Rainer,

Rainer Jung wrote:
 Hi,

 1) How many fds does the process have? So is the question why can't
 we use all those 4096 fds configured, or is it where do those 4096
 fds used by my process come from?
The latter. We can actually see the 4096 fds are used (by port 8080 in
CLOSE_WAIT state...).
Well, we're pretty sure that the fds actually are the connections from
the HTTPConnector of Tomcat. The connector is set to use 200 connections
simultaneously. So the question is: Why aren't those connections closed?...



 2) CLOSE_WAIT means the remote side closed the connection and the
 local side didn't yet close it. What's you remote side with respect to
 TCP? Is it browsers, or a load balancer or stuff like that?
We have NGINX as a proxy in front of the Tomcat (on another server), so
requests from the Internet arrive at NGINX and are then forwarded to the
Tomcat(s).
By now we're pretty happy with NGINX, since it is really fast and has a
low footprint, but it could well be that it does not work well with Tomcat.

We see the problems on our live servers, so the application that is
actually initiating the connection is a browser.


 3) Are you using keep alive (not implying that's the cause of your
 problems, but keep alive makes the connection live cycle much more
 complicated from the container point of view).
As far as I understand NGINX, we only use keep-alive requests for the
communication between client and NGINX. The communication between NGINX
and Tomcat does not have settings for keep-alive, so I assume: no.

This is the relevant part of the NGINX configuration:

location / {
    proxy_pass http://verwandt_de;
    proxy_redirect off;

    proxy_set_header  Host             $host;
    proxy_set_header  X-Real-IP        $remote_addr;
    proxy_set_header  X-Forwarded-For  $proxy_add_x_forwarded_for;

    client_max_body_size   10m;
    client_body_temp_path  /var/nginx/client_body_temp;

    proxy_buffering off;
    proxy_store off;

    proxy_connect_timeout  30;
    proxy_send_timeout     80;
    proxy_read_timeout     80;
}
 

So, any suggestions? Or should I move this topic on to some NGINX
mailing list?

Kind regards,

Tobias.


 Regards,
 Rainer


 Tobias Schulz-Hess wrote:
 Hi there,

 we use the current Tomcat 6.0 on 2 machines. The hardware is brand
 new and is really fast. We get lots of traffic which is usually
 handled well by the tomcats and the load on those machines is between
 1 and 6 (when we have lots of traffic).
 The machines have debian 4.1/64 as OS.

 However, sometimes (especially if we have lots of traffic) we get the
 following exception:
 INFO   | jvm 1| 2008/01/23 15:28:18 | java.net.SocketException:
 Too many open files
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 java.net.PlainSocketImpl.socketAccept(Native Method)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 java.net.PlainSocketImpl.accept(PlainSocketImpl.java:384)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 java.net.ServerSocket.implAccept(ServerSocket.java:453)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 java.net.ServerSocket.accept(ServerSocket.java:421)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:61)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:310)
 INFO   | jvm 1| 2008/01/23 15:28:18 |   at
 java.lang.Thread.run(Thread.java:619)

 We have already raised the ulimit from 1024 (the default) to 4096 (thereby
 proving: yes, I have used Google and read almost everything about that
 exception).

 We also looked at the open files: 95% of them are from or to the
 Tomcat port 8080. (The other 5% are open JARs, connections to
 memcached and MySQL, and SSL sockets.)

 Most of the connections to port 8080 are in the CLOSE_WAIT state.

 I have the strong feeling that something (Tomcat, the JVM, whatever)
 relies on the JVM garbage collection to close those open
 connections. However, under heavy load the garbage collection is
 delayed, and then the connections pile up. But this is just a guess.

 How can this problem be solved?

 Thank you and kind regards,

 Tobias.

 ---
 Tobias Schulz-Hess




Re: Too many open files exception under heavy load - need help!

2008-01-25 Thread Rainer Jung

Tobias Schulz-Hess wrote:

Hi Rainer,

Rainer Jung schrieb:

Hi,

1) How many fds does the process have? So is the question "why can't
we use all those 4096 fds configured", or is it "where do those 4096
fds used by my process come from"?

The latter. We can actually see that all 4096 fds are in use (by
connections on port 8080 in CLOSE_WAIT state...).
Well, we're pretty sure the fds really are the connections from the
HTTP connector of Tomcat. The connector is set to handle 200 connections
simultaneously. So the question is: why aren't those connections closed?...


Are you using the tcnative APR connector?


2) CLOSE_WAIT means the remote side closed the connection and the
local side didn't close it yet. What's your remote side with respect
to TCP? Is it browsers, or a load balancer, or something like that?

We have NGINX as a proxy in front of the Tomcats (on another server), so
requests from the Internet arrive at NGINX and are then forwarded to the
tomcat(s).
So far we're pretty happy with NGINX, since it is really fast and has a
low footprint, but it could well be that it does not play well with Tomcat.

We see the problems on our live servers, so the application that actually
initiates the connections is a browser.


3) Are you using keep-alive? (Not implying that's the cause of your
problems, but keep-alive makes the connection life cycle much more
complicated from the container's point of view.)

As far as I understand NGINX, we only use keep-alive requests between
the client and NGINX. The communication between NGINX and Tomcat has no
keep-alive settings, so I assume: no.

This is the relevant part of the NGINX configuration:

location / {
    proxy_pass              http://verwandt_de;
    proxy_redirect          off;

    proxy_set_header        Host              $host;
    proxy_set_header        X-Real-IP         $remote_addr;
    proxy_set_header        X-Forwarded-For   $proxy_add_x_forwarded_for;

    client_max_body_size    10m;
    client_body_temp_path   /var/nginx/client_body_temp;

    proxy_buffering         off;
    proxy_store             off;

    proxy_connect_timeout   30;
    proxy_send_timeout      80;
    proxy_read_timeout      80;
}


So, any suggestions? Or should I move this topic to an NGINX mailing
list?


Not sure yet. It's interesting that there is a 30-second timeout in
this config. Maybe you should investigate what those 30 seconds mean.
On the other hand, 30 seconds is not that rare as a default ...


What about experimenting with maxKeepAliveRequests="1" in your HTTP
connector (server.xml)?
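
Roughly like this, as a sketch (not the poster's actual server.xml; the
other attributes are assumptions):

   <Connector port="8080" protocol="HTTP/1.1"
              maxThreads="200"
              connectionTimeout="20000"
              maxKeepAliveRequests="1" />

With maxKeepAliveRequests="1" the connector disables keep-alive, so Tomcat
closes each connection itself after serving one request instead of waiting
for the peer to close it.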






Re: Too many open files exception under heavy load - need help!

2008-01-24 Thread Bruno Vilardo
Tobias,

You probably need to tune some kernel parameters. I had some issues
with our application getting stuck at some point, and we needed to
restart everything. And since you said it is a brand-new server, you
might still have the default values set in there.

What does uname -a say?

The kernel parameter controlling that changes from one UNIX flavor to
the next; generally it's named NFILES, MAXFILES or NINODE. I usually
tune these parameters for our Progress databases.
For Linux, this can be done dynamically by launching (from the OS
prompt):

 echo 16384 > /proc/sys/fs/file-max
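
Note that writing to /proc only lasts until the next reboot. To make the
setting persistent (a sketch; assumes a standard /etc/sysctl.conf):

 echo 'fs.file-max = 16384' >> /etc/sysctl.conf
 sysctl -p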

Regards,

Bruno





Re: Too Many Open Files error

2007-10-20 Thread Filip Hanik - Dev Lists

usually it is added at

/etc/security/limits.conf

Filip

Nix Hanwei wrote:

Hi Dan,

You can try the sysctl.conf file.  Add in the ulimit -n value for open files.




Re: Too Many Open Files error

2007-10-19 Thread Daniel M Garland

Thanks Jim,

It was previously set to 1024, and I quadrupled it. When you say ulimit
is persistent, do you mean it will persist across a reboot?

I don't seem to have the lsof command; I'll try to apt-get it.

Cheers
Dan




Re: Too Many Open Files error

2007-10-19 Thread Nix Hanwei
Hi Dan,

You can try the sysctl.conf file.  Add in the ulimit -n value for open files.
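
For reference, these are two different knobs: /etc/sysctl.conf takes kernel
settings such as fs.file-max, while per-process limits live in
/etc/security/limits.conf. A sketch of both, with example values only:

   # /etc/sysctl.conf -- system-wide fd ceiling
   fs.file-max = 65536

   # /etc/security/limits.conf -- per-user limit
   tomcat  soft  nofile  4096
   tomcat  hard  nofile  4096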




Re: Too Many Open Files error

2007-10-19 Thread Jim Cox
On 10/19/07, Daniel M Garland [EMAIL PROTECTED] wrote:
 It was previously set to 1024, and I quadrupled it. When you say ulimit
 is persistent, do you mean it will persist across a reboot?

 I don't seem to have the lsof command; I'll try to apt-get it.

 Cheers
 Dan

The settings should persist, but there's a chance that a startup
script sets it after every reboot (e.g. Fedora has (or at least had) a
line to disable core file generation in /etc/profile, I think). Easy
enough to test, assuming you can reboot the box.

Besides lsof, a quick and dirty way to count the number of open files
for a process (firefox here; substitute Tomcat's pid for
$(pgrep firefox) in your case):
  example-prompt$ ls /proc/$(pgrep firefox)/fd | wc -l
  75

You didn't supply much detail about Tomcat's usage, but if the open
file limit is indeed 1024 (and you don't have a heavily-used server)
you might be leaking file handles somewhere in your JSP pages.
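
To see which kinds of descriptors dominate before pointing at the JSPs
(a sketch; assumes lsof is installed and $PID holds Tomcat's process id):

  lsof -p $PID | awk '{print $5}' | sort | uniq -c | sort -rn

A count dominated by IPv4/IPv6 entries points at sockets; a pile of REG
entries points at ordinary files left open.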




Re: Too Many Open Files error

2007-10-19 Thread Jim Cox
On 10/19/07, Daniel M Garland [EMAIL PROTECTED] wrote:
 Should I then place ulimit -n  in the catalina startup scripts?

Setting a limit with ulimit is sticky (i.e. persistent), so there's
no need to stick it in the startup script.

However, you didn't answer the previous two questions: (1) how many
files did Tomcat have open when you got the "Too many open files"
error, and (2) what is the current ulimit setting for open files? If
you provide those answers, people here can help you out a bit more.




Re: Too Many Open Files error

2007-10-19 Thread Daniel M Garland

Should I then place ulimit -n  in the catalina startup scripts?




Re: Too Many Open Files error

2007-10-18 Thread Peter Bauer
On Thursday, 18 October 2007, Daniel M Garland wrote:
 Hi all

 I'm seeing a problem on a Tomcat instance:

 18-Oct-2007 12:41:47 org.apache.tomcat.util.net.AprEndpoint$Acceptor run
 SEVERE: Socket accept failed
 org.apache.tomcat.jni.Error: Too many open files
  at org.apache.tomcat.jni.Socket.accept(Native Method)
  at
 org.apache.tomcat.util.net.AprEndpoint$Acceptor.run(AprEndpoint.java:1001)
  at java.lang.Thread.run(Thread.java:595)

 Looking through Google and the advice seems to increase the number of
 file descriptors. I'm on debian etch and

 cat /proc/sys/fs/file-max
 gives
 369540

 I don't believe that I've hit this limit or that increasing this value
 would be sensible.

 Given that everything in linux is a file, does this mean that
 connections are not being closed properly? Where would be a good place
 to start debugging this problem?

 Thanks in advance
 Dan


Hi Dan,

try using lsof to check which files and network connections (those
count, too) you have open.
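
For example (assuming the Tomcat user is called tomcat and the connector
port is 8080):

  lsof -u tomcat | wc -l          # everything held by that user
  lsof -a -u tomcat -i :8080      # only that user's sockets on the connector port

The -a flag ANDs the two selections; without it lsof ORs them.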

br,
Peter

-- 
Peter Bauer
APUS Software G.m.b.H.
A-8074 Raaba, Bahnhofstrasse 1/1
Email: [EMAIL PROTECTED]
Tel: +43 316 401629 24
Fax: +43 316 401629 9




Re: Too Many Open Files error

2007-10-18 Thread Jim Cox

The relevant thing is probably the per-process open file limit -- what
does ulimit -a return for whatever user you're running Tomcat under?
