This is the executor config:

<Executor name="tomcatThreadPool" namePrefix="catalina-exec-"
          maxThreads="2048" minSpareThreads="1024" maxQueueSize="10000"
          prestartminSpareThreads="true"/>

This is the connector config:

<Connector port="8080"
           protocol="org.apache.coyote.http11.Http11NioProtocol"
           redirectPort="8443"
           acceptCount="100000" maxConnections="-1"
           acceptorThreadCount="5" executor="tomcatThreadPool"
           connectionLinger="-1" socket.soLingerOn="false" socket.soLingerTime="0"
           socket.soReuseAddress="true" connectionTimeout="1000"
           socket.soTimeout="1000" keepAliveTimeout="0" maxKeepAliveRequests="1"
           socket.soKeepAlive="false" />

> The only way to know for sure is if you use a profiler and find out
> where your application is using memory.

Yes, we do cache the AsyncResponse objects until the timeout fires or a
response is generated.
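
For context, the caching is roughly along these lines. This is a minimal
sketch, not our exact code; the resource path, class name, map and client id
are illustrative only:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.TimeUnit;

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.container.AsyncResponse;
import javax.ws.rs.container.Suspended;
import javax.ws.rs.container.TimeoutHandler;
import javax.ws.rs.core.Response;

// Illustrative sketch of our long-poll resource; names are placeholders.
@Path("/poll")
public class LongPollResource {

    // Suspended responses are cached here until data arrives or the timeout fires.
    private static final ConcurrentMap<String, AsyncResponse> PENDING =
            new ConcurrentHashMap<String, AsyncResponse>();

    @GET
    @Path("{clientId}")
    public void poll(@PathParam("clientId") final String clientId,
                     @Suspended final AsyncResponse response) {
        response.setTimeout(10, TimeUnit.MINUTES); // 10 minutes in this test
        response.setTimeoutHandler(new TimeoutHandler() {
            public void handleTimeout(AsyncResponse r) {
                PENDING.remove(clientId);
                r.resume(Response.noContent().build());
            }
        });
        PENDING.put(clientId, response);
    }

    // Called from elsewhere in the application when data becomes available.
    public static void push(String clientId, Object payload) {
        AsyncResponse r = PENDING.remove(clientId);
        if (r != null) {
            r.resume(payload);
        }
    }
}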

> How long does a request take to process? Exactly how many concurrent
> requests are you trying to support?
A long-poll request has a timeout of 10 minutes in this test, but we want to
push it up to 60 minutes if feasible.
We are trying to figure out the maximum number of concurrent requests we can
sustain.
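
For illustration, by "figure out" we mean something like the crude probe
below: open as many idle long-poll connections as the server will hold and
see where connects start failing. Host, port, path and target count are
placeholders, not our actual test harness, and the client will hit its own
file descriptor and ephemeral port limits as well:

import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

// Illustrative probe only; host, port, path and target are placeholders.
public class IdleConnectionProbe {

    public static void main(String[] args) throws Exception {
        String host = "localhost";
        int port = 8080;
        int target = 100000;

        List<Socket> open = new ArrayList<Socket>();
        for (int i = 0; i < target; i++) {
            try {
                Socket s = new Socket();
                s.connect(new InetSocketAddress(host, port), 1000);
                OutputStream out = s.getOutputStream();
                // Hit the (illustrative) long-poll endpoint so the server
                // suspends the request and keeps the connection open.
                out.write(("GET /poll/" + i + " HTTP/1.1\r\nHost: " + host
                        + "\r\n\r\n").getBytes("US-ASCII"));
                out.flush();
                open.add(s); // hold the socket open, never read the response
            } catch (Exception e) {
                // The failure may be on the client side (fd or ephemeral port
                // limits) long before the server gives up.
                System.out.println("Stopped after " + open.size() + " connections: " + e);
                break;
            }
        }
        System.out.println("Holding " + open.size() + " idle connections");
        Thread.sleep(Long.MAX_VALUE); // keep them idle until the JVM is killed
    }
}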

On Fri, Aug 22, 2014 at 2:36 PM, Mark Thomas <ma...@apache.org> wrote:

> On 22/08/2014 09:47, anurag gupta wrote:
> > Thanks Mark.
> >
> > The same application is running in a jetty9 server. And I ran a test for
> 5
> > hours with 300,000 requests (moving window of 9mins) with 10g of heap.
> > Jetty didn't crash with OOM. So I guess my application is not the source
> of
> > OOM.
>
> I disagree. I suspect configuration differences.
>
> The only way to know for sure is if you use a profiler and find out
> where your application is using memory.
>
> > I'm currently using tomcat 7.0.50 in production and it is doing well and
> I
> > don't want to migrate to jetty just for long polling (implemented using
> > AsyncResponse).
>
> Which connector?
>
> > Any suggestions ??
>
> How long does a request take to process? Exactly how many concurrent
> requests are you trying to support?
>
> Mark
>
>
> >
> > Regards
> > Anurag
> >  On Aug 22, 2014 2:10 PM, "Mark Thomas" <ma...@apache.org> wrote:
> >
> >> On 22/08/2014 06:03, anurag gupta wrote:
> >>>
> >>>
> >>> Hi All,
> >>>
> >>>  I'm trying to implement long polling using the servlet 3.0 spec.
> >>> Implementation wise it's done and works fine in tomcat. The problem
> >> occurs
> >>> when it is under load, for eg. when we send just 100,000 requests we
> see
> >>> weird behaviour like requests timeout before the defined timeout,
> Tomcat
> >>> goes OOM because of GC overhead limit exceeding.
> >>
> >> The root cause of the OOM is most likely your application rather than
> >> Tomcat.
> >>
> >>> I have tried this on 2 diff versions of tomcat (mentioned in subject).
> >>>
> >>> OS CentOS 6.5
> >>> Process memory 10g both Xmx and Xms
> >>>
> >>> So I have a question, upto how many concurrent open(idle) connections
> can
> >>> a tomcat instance handle ?
> >>
> >> As many as your operating system will allow. (Hint: It will be less than
> >> 100k).
> >>
> >>> How to achieve maximum idle connections ?
> >>
> >> Fix your application so it doesn't trigger an OOME.
> >>
> >> Tune your OS.
> >>
> >> Mark
> >>
> >>
> >>
> >
>
>
>
>


-- 
Regards
Anurag
