Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-29 Thread anurag gupta
Can anyone help regarding this?

Update:--

A simple test of a long-polling implementation on Tomcat 7.0.50/7.0.55 using the
JAX-RS 2.0 AsyncResponse mechanism.
I'm seeing the following errors in the logs and a large number of CLOSE_WAIT
connections. Why?
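For reference, a minimal sketch of the kind of long-poll endpoint being tested (the resource path, timeout handling and the 10-minute value are illustrative assumptions, not the actual application code):

import java.util.concurrent.TimeUnit;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;
import javax.ws.rs.container.TimeoutHandler;
import javax.ws.rs.core.MediaType;

@Path("/poll")
public class LongPollResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public void poll(@Suspended final AsyncResponse response) {
        // Suspend the request: the connection stays open but the container
        // thread is released back to the pool until resume() is called.
        response.setTimeout(10, TimeUnit.MINUTES);

        // Worst case (no data before the timeout): answer with an empty JSON object.
        response.setTimeoutHandler(new TimeoutHandler() {
            @Override
            public void handleTimeout(AsyncResponse ar) {
                ar.resume("{}");
            }
        });

        // Elsewhere, an event source eventually calls response.resume(payload)
        // to complete the exchange.
    }
}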

Exception in thread http-nio-8080-ClientPoller-0 java.util.ConcurrentModificationException
    at java.util.HashMap$HashIterator.nextEntry(HashMap.java:926)
    at java.util.HashMap$KeyIterator.next(HashMap.java:960)
    at java.util.Collections$UnmodifiableCollection$1.next(Collections.java:1067)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.timeout(NioEndpoint.java:1437)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.run(NioEndpoint.java:1231)
    at java.lang.Thread.run(Thread.java:744)

[ERROR] [2014-08-29 05:54:19,622] [-8080-ClientPoller-0] [he.tomcat.util.net.NioEndpoint] Error allocating socket processor
java.lang.NullPointerException
    at org.apache.tomcat.util.net.NioEndpoint.processSocket(NioEndpoint.java:742)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.processKey(NioEndpoint.java:1273)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.run(NioEndpoint.java:1226)
    at java.lang.Thread.run(Thread.java:744)

[ERROR] [2014-08-29 05:48:35,941] [-8080-ClientPoller-1] [he.tomcat.util.net.NioEndpoint] Error allocating socket processor
java.lang.NullPointerException
    at org.apache.tomcat.util.net.NioEndpoint.processSocket(NioEndpoint.java:742)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.processKey(NioEndpoint.java:1273)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.run(NioEndpoint.java:1226)
    at java.lang.Thread.run(Thread.java:744)

Exception in thread http-nio-8080-ClientPoller-0 java.util.ConcurrentModificationException
    at java.util.HashMap$HashIterator.nextEntry(HashMap.java:926)
    at java.util.HashMap$KeyIterator.next(HashMap.java:960)
    at java.util.Collections$UnmodifiableCollection$1.next(Collections.java:1067)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.timeout(NioEndpoint.java:1437)
    at org.apache.tomcat.util.net.NioEndpoint$Poller.run(NioEndpoint.java:1231)
    at java.lang.Thread.run(Thread.java:744)






On Fri, Aug 22, 2014 at 5:25 PM, anurag gupta anurag.11...@gmail.com
wrote:

 Ok, so the requests will be idle up to the long-poll timeout if no response
 is generated.

 So in our test setup we have 60 clients and each makes 5000 requests. These
 5000 requests are made at the same time and renewed (i.e. a new request is
 made in a loop) as soon as the app server sends a response (which in the
 worst case, i.e. no response was available, will be an empty JSON body).

 A few minutes back I tried with processorCache=50, but
 Tomcat (8.0.9) still logged an OOM ("GC overhead limit exceeded") and around
 70K sockets were open on the server (from /proc/net/sockstat).





 On Fri, Aug 22, 2014 at 5:03 PM, Mark Thomas ma...@apache.org wrote:

 On 22/08/2014 11:22, anurag gupta wrote:
  Executors:-
 
  <Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
            maxThreads="2048" minSpareThreads="1024" maxQueueSize="1"
            prestartminSpareThreads="true"/>
 
  This is the connector config :-
 
  <Connector port="8080"
             protocol="org.apache.coyote.http11.Http11NioProtocol" redirectPort="8443"
             acceptCount="10" maxConnections="-1"
             acceptorThreadCount="5" executor="tomcatThreadPool"
             connectionLinger="-1" socket.soLingerOn="false" socket.soLingerTime="0"
             socket.soReuseAddress="true" connectionTimeout="1000"
             socket.soTimeout="1000" keepAliveTimeout="0" maxKeepAliveRequests="1"
             socket.soKeepAlive="false"/>
 
  The only way to know for sure is if you use a profiler and find out
  where your application is using memory.
 
  Yes we do cache the AsyncResponse objects till the timeout happens or
 some
  response is generated.
 
  How long does a request take to process? Exactly how many concurrent
  requests are you trying to support?
  A long poll request has a timeout of 10 mins (in this test), but we want to
  have it up to 60 mins if feasible.
  We are trying to figure out the max achievable concurrent requests.

 Concurrent requests != concurrent connections.

 Concurrent requests (i.e. where the server is actively doing something
 with a connection) will be limited to 2048 with that configuration
 (maximum number of available threads).

  Concurrent connections will depend on your test environment. For a single
 Tomcat HTTP connector, there is a hard limit of 64k connections per
 client but you can use multiple clients (each with their own IP address)
 to get around that. After that, you'll hit OS limits - that should be
 around several hundred k.

 Mark


 -
 To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
 For additional commands, e-mail: users-h...@tomcat.apache.org




 --
 Regards
 Anurag




-- 
Regards
Anurag


Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread Mark Thomas
On 22/08/2014 06:03, anurag gupta wrote:


 Hi All,

  I'm trying to implement long polling using the Servlet 3.0 spec.
 Implementation-wise it's done and works fine in Tomcat. The problem occurs
 when it is under load; e.g. when we send just 100,000 requests we see
 weird behaviour like requests timing out before the defined timeout and Tomcat
 going OOM because the GC overhead limit is exceeded.

The root cause of the OOM is most likely your application rather than
Tomcat.

 I have tried this on 2 diff versions of tomcat (mentioned in subject).

 OS CentOS 6.5
 Process memory 10g both Xmx and Xms

 So I have a question, upto how many concurrent open(idle) connections can
 a tomcat instance handle ?

As many as your operating system will allow. (Hint: It will be less than
100k).

 How to achieve maximum idle connections ?

Fix your application so it doesn't trigger an OOME.

Tune your OS.

Mark

-
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org



Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread anurag gupta
Thanks Mark.

The same application is running in a Jetty 9 server, and I ran a test for 5
hours with 300,000 requests (moving window of 9 mins) with 10g of heap.
Jetty didn't crash with an OOM, so I guess my application is not the source
of the OOM.

I'm currently using Tomcat 7.0.50 in production and it is doing well, and I
don't want to migrate to Jetty just for long polling (implemented using
AsyncResponse).

Any suggestions?

Regards
Anurag
 On Aug 22, 2014 2:10 PM, Mark Thomas ma...@apache.org wrote:

 On 22/08/2014 06:03, anurag gupta wrote:
 
 
  Hi All,
 
   I'm trying to implement long polling using the servlet 3.0 spec.
  Implementation wise it's done and works fine in tomcat. The problem
 occurs
  when it is under load, for eg. when we send just 100,000 requests we see
  weird behaviour like requests timeout before the defined timeout, Tomcat
  goes OOM because of GC overhead limit exceeding.

 The root cause of the OOM is most likely your application rather than
 Tomcat.

  I have tried this on 2 diff versions of tomcat (mentioned in subject).
 
  OS CentOS 6.5
  Process memory 10g both Xmx and Xms
 
  So I have a question, upto how many concurrent open(idle) connections can
  a tomcat instance handle ?

 As many as your operating system will allow. (Hint: It will be less than
 100k).

  How to achieve maximum idle connections ?

 Fix your application so it doesn't trigger an OOME.

 Tune your OS.

 Mark

 -
 To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
 For additional commands, e-mail: users-h...@tomcat.apache.org




Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread Mark Thomas
On 22/08/2014 09:47, anurag gupta wrote:
 Thanks Mark.
 
 The same application is running in a jetty9 server. And I ran a test for 5
 hours with 300,000 requests (moving window of 9mins) with 10g of heap.
 Jetty didn't crash with OOM. So I guess my application is not the source of
 OOM.

I disagree. I suspect configuration differences.

The only way to know for sure is if you use a profiler and find out
where your application is using memory.

 I'm currently using tomcat 7.0.50 in production and it is doing well and I
 don't want to migrate to jetty just for long polling (implemented using
 AsyncResponse).

Which connector?

 Any suggestions ??

How long does a request take to process? Exactly how many concurrent
requests are you trying to support?

Mark


 
 Regards
 Anurag
  On Aug 22, 2014 2:10 PM, Mark Thomas ma...@apache.org wrote:
 
 On 22/08/2014 06:03, anurag gupta wrote:


 Hi All,

  I'm trying to implement long polling using the servlet 3.0 spec.
 Implementation wise it's done and works fine in tomcat. The problem
 occurs
 when it is under load, for eg. when we send just 100,000 requests we see
 weird behaviour like requests timeout before the defined timeout, Tomcat
 goes OOM because of GC overhead limit exceeding.

 The root cause of the OOM is most likely your application rather than
 Tomcat.

 I have tried this on 2 diff versions of tomcat (mentioned in subject).

 OS CentOS 6.5
 Process memory 10g both Xmx and Xms

 So I have a question, upto how many concurrent open(idle) connections can
 a tomcat instance handle ?

 As many as your operating system will allow. (Hint: It will be less than
 100k).

 How to achieve maximum idle connections ?

 Fix your application so it doesn't trigger an OOME.

 Tune your OS.

 Mark

 -
 To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
 For additional commands, e-mail: users-h...@tomcat.apache.org


 


-
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org



Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread anurag gupta
Executors:-

<Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
          maxThreads="2048" minSpareThreads="1024" maxQueueSize="1"
          prestartminSpareThreads="true"/>

This is the connector config :-

<Connector port="8080"
           protocol="org.apache.coyote.http11.Http11NioProtocol" redirectPort="8443"
           acceptCount="10" maxConnections="-1"
           acceptorThreadCount="5" executor="tomcatThreadPool"
           connectionLinger="-1" socket.soLingerOn="false" socket.soLingerTime="0"
           socket.soReuseAddress="true" connectionTimeout="1000"
           socket.soTimeout="1000" keepAliveTimeout="0" maxKeepAliveRequests="1"
           socket.soKeepAlive="false"/>

 The only way to know for sure is if you use a profiler and find out
 where your application is using memory.

Yes we do cache the AsyncResponse objects till the timeout happens or some
response is generated.
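As a rough sketch of what that caching might look like (the class, map key and method names are hypothetical, not the actual application code):

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.TimeUnit;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.TimeoutHandler;

/** Holds suspended responses until an event arrives or the long poll times out. */
public class PendingResponses {

    private final ConcurrentMap<String, AsyncResponse> pending =
            new ConcurrentHashMap<String, AsyncResponse>();

    public void register(final String clientId, final AsyncResponse response) {
        response.setTimeout(10, TimeUnit.MINUTES);
        response.setTimeoutHandler(new TimeoutHandler() {
            @Override
            public void handleTimeout(AsyncResponse ar) {
                pending.remove(clientId);  // drop the reference so it can be collected
                ar.resume("{}");           // empty JSON when nothing was available
            }
        });
        pending.put(clientId, response);
    }

    /** Called by the event source when data for a client becomes available. */
    public void complete(String clientId, String jsonPayload) {
        AsyncResponse cached = pending.remove(clientId);
        if (cached != null && cached.isSuspended()) {
            cached.resume(jsonPayload);
        }
    }
}

If entries like these are not removed on completion or timeout, each one pins a suspended request in memory for up to the long-poll timeout, which is exactly the kind of retention a profiler would surface.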

 How long does a request take to process? Exactly how many concurrent
requests are you trying to support?
A long poll request has a timeout of 10 mins (in this test), but we want to
have it up to 60 mins if feasible.
We are trying to figure out the max achievable concurrent requests.






On Fri, Aug 22, 2014 at 2:36 PM, Mark Thomas ma...@apache.org wrote:

 On 22/08/2014 09:47, anurag gupta wrote:
  Thanks Mark.
 
  The same application is running in a jetty9 server. And I ran a test for
 5
  hours with 300,000 requests (moving window of 9mins) with 10g of heap.
  Jetty didn't crash with OOM. So I guess my application is not the source
 of
  OOM.

 I disagree. I suspect configuration differences.

  The only way to know for sure is if you use a profiler and find out
  where your application is using memory.

  I'm currently using tomcat 7.0.50 in production and it is doing well and
 I
  don't want to migrate to jetty just for long polling (implemented using
  AsyncResponse).

 Which connector?

  Any suggestions ??

 How long does a request take to process? Exactly how many concurrent
 requests are you trying to support?

 Mark


 
  Regards
  Anurag
   On Aug 22, 2014 2:10 PM, Mark Thomas ma...@apache.org wrote:
 
  On 22/08/2014 06:03, anurag gupta wrote:
 
 
  Hi All,
 
   I'm trying to implement long polling using the servlet 3.0 spec.
  Implementation wise it's done and works fine in tomcat. The problem
  occurs
  when it is under load, for eg. when we send just 100,000 requests we
 see
  weird behaviour like requests timeout before the defined timeout,
 Tomcat
  goes OOM because of GC overhead limit exceeding.
 
  The root cause of the OOM is most likely your application rather than
  Tomcat.
 
  I have tried this on 2 diff versions of tomcat (mentioned in subject).
 
  OS CentOS 6.5
  Process memory 10g both Xmx and Xms
 
  So I have a question, upto how many concurrent open(idle) connections
 can
  a tomcat instance handle ?
 
  As many as your operating system will allow. (Hint: It will be less than
  100k).
 
  How to achieve maximum idle connections ?
 
  Fix your application so it doesn't trigger an OOME.
 
  Tune your OS.
 
  Mark
 
  -
  To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
  For additional commands, e-mail: users-h...@tomcat.apache.org
 
 
 


 -
 To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
 For additional commands, e-mail: users-h...@tomcat.apache.org




-- 
Regards
Anurag


Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread Mark Thomas
On 22/08/2014 11:22, anurag gupta wrote:
 Executors:-
 
 <Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
           maxThreads="2048" minSpareThreads="1024" maxQueueSize="1"
           prestartminSpareThreads="true"/>
 
 This is the connector config :-
 
 <Connector port="8080"
            protocol="org.apache.coyote.http11.Http11NioProtocol" redirectPort="8443"
            acceptCount="10" maxConnections="-1"
            acceptorThreadCount="5" executor="tomcatThreadPool"
            connectionLinger="-1" socket.soLingerOn="false" socket.soLingerTime="0"
            socket.soReuseAddress="true" connectionTimeout="1000"
            socket.soTimeout="1000" keepAliveTimeout="0" maxKeepAliveRequests="1"
            socket.soKeepAlive="false"/>
 
 The only way to know for sure is if you use a profiler and find out
 where your application is using memory.
 
 Yes we do cache the AsyncResponse objects till the timeout happens or some
 response is generated.
 
 How long does a request take to process? Exactly how many concurrent
 requests are you trying to support?
 A long poll request has a timeout of 10 mins (in this test), but we want to
 have it up to 60 mins if feasible.
 We are trying to figure out the max achievable concurrent requests.

Concurrent requests != concurrent connections.

Concurrent requests (i.e. where the server is actively doing something
with a connection) will be limited to 2048 with that configuration
(maximum number of available threads).

Concurrent connections will depend on your test environment. For a single
Tomcat HTTP connector, there is a hard limit of 64k connections per
client but you can use multiple clients (each with their own IP address)
to get around that. After that, you'll hit OS limits - that should be
around several hundred k.

Mark


-
To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
For additional commands, e-mail: users-h...@tomcat.apache.org



Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-22 Thread anurag gupta
Ok, so the requests will be idle up to the long-poll timeout if no response
is generated.

So in our test setup we have 60 clients and each makes 5000 requests. These
5000 requests are made at the same time and renewed (i.e. a new request is
made in a loop) as soon as the app server sends a response (which in the
worst case, i.e. no response was available, will be an empty JSON body).

A few minutes back I tried with processorCache=50, but
Tomcat (8.0.9) still logged an OOM ("GC overhead limit exceeded") and around
70K sockets were open on the server (from /proc/net/sockstat).
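A minimal sketch of one such client renew loop as described above (blocking I/O, one thread per outstanding request; the URL and timeouts are illustrative assumptions, not the actual test harness):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class LongPollClientLoop {

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; the real test targets the application's long-poll resource.
        URL url = new URL("http://server:8080/app/poll");
        while (true) {
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setReadTimeout(11 * 60 * 1000);             // a bit above the 10 min long-poll timeout
            conn.setRequestProperty("Connection", "close");  // matches maxKeepAliveRequests="1" on the server
            InputStream in = conn.getInputStream();
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) {
                // drain the (possibly empty) JSON body
            }
            in.close();
            conn.disconnect();
            // Renew immediately: issue the next long poll as soon as this one returns.
        }
    }
}

In the setup described above, each of the 60 client machines would be running 5000 loops of roughly this shape concurrently.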





On Fri, Aug 22, 2014 at 5:03 PM, Mark Thomas ma...@apache.org wrote:

 On 22/08/2014 11:22, anurag gupta wrote:
  Executors:-
 
  <Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
            maxThreads="2048" minSpareThreads="1024" maxQueueSize="1"
            prestartminSpareThreads="true"/>
 
  This is the connector config :-
 
  <Connector port="8080"
             protocol="org.apache.coyote.http11.Http11NioProtocol" redirectPort="8443"
             acceptCount="10" maxConnections="-1"
             acceptorThreadCount="5" executor="tomcatThreadPool"
             connectionLinger="-1" socket.soLingerOn="false" socket.soLingerTime="0"
             socket.soReuseAddress="true" connectionTimeout="1000"
             socket.soTimeout="1000" keepAliveTimeout="0" maxKeepAliveRequests="1"
             socket.soKeepAlive="false"/>
 
  The only way to know for sure is if you use a profiler and find out
  where your application is using memory.
 
  Yes we do cache the AsyncResponse objects till the timeout happens or
 some
  response is generated.
 
  How long does a request take to process? Exactly how many concurrent
  requests are you trying to support?
  A long poll request has a timeout of 10 mins (in this test), but we want to
  have it up to 60 mins if feasible.
  We are trying to figure out the max achievable concurrent requests.

 Concurrent requests != concurrent connections.

 Concurrent requests (i.e. where the server is actively doing something
 with a connection) will be limited to 2048 with that configuration
 (maximum number of available threads).

 Concurrent connections will depend on your test environment. For a single
 Tomcat HTTP connector, there is a hard limit of 64k connections per
 client but you can use multiple clients (each with their own IP address)
 to get around that. After that, you'll hit OS limits - that should be
 around several hundred k.

 Mark


 -
 To unsubscribe, e-mail: users-unsubscr...@tomcat.apache.org
 For additional commands, e-mail: users-h...@tomcat.apache.org




-- 
Regards
Anurag


Re: Long Polling : Tomcat 7.0.50 / 8.0.9

2014-08-21 Thread anurag gupta


 Hi All,

  I'm trying to implement long polling using the Servlet 3.0 spec.
 Implementation-wise it's done and works fine in Tomcat. The problem occurs
 when it is under load; e.g. when we send just 100,000 requests we see
 weird behaviour like requests timing out before the defined timeout and Tomcat
 going OOM because the GC overhead limit is exceeded.

 I have tried this on two different versions of Tomcat (mentioned in the subject).

 OS: CentOS 6.5
 Process memory: 10g (both Xmx and Xms)

 So I have a question: up to how many concurrent open (idle) connections can
 a Tomcat instance handle? How do we achieve the maximum number of idle connections?
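For context, long polling on the bare Servlet 3.0 API (which JAX-RS 2.0's AsyncResponse builds on) comes down to something like the following sketch; the servlet path and 10-minute timeout are illustrative, not the poster's actual code:

import java.io.IOException;
import javax.servlet.AsyncContext;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet(urlPatterns = "/poll", asyncSupported = true)
public class LongPollServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Detach the request from the container thread; the connection stays open (idle)
        // until complete() is called or the timeout fires.
        AsyncContext ctx = req.startAsync();
        ctx.setTimeout(10 * 60 * 1000L);

        // A separate event source later writes the body via ctx.getResponse()
        // and calls ctx.complete() to finish the exchange.
    }
}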



-- 
Regards
Anurag