At a glance, this looks like autowarming. Is there any process that could be indexing documents when you see this?

Here's a quick test: set all your autowarm counts in solrconfig to 0, and if the problem goes away, that's a smoking gun. Or look through your Solr logs around the time you see this; you should see some indication that a commit has happened and/or autowarming is going on.

Best,
Erick
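
For reference, zeroing those counts means setting autowarmCount="0" on each cache in solrconfig.xml. A minimal sketch (the cache classes and sizes here are illustrative, not taken from this thread):

----------------------------------------------------------
<!-- rule out autowarming by disabling it on every cache -->
<filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="0"/>
<queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
----------------------------------------------------------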

On Thu, Jan 16, 2014 at 2:57 AM, YouPeng Yang <yypvsxf19870...@gmail.com> wrote:
> Hi
>   By the way, after I restart the web container, the ratio returns to normal.
>   So when does this situation come about?
>
> Regards
>
> 2014/1/16 YouPeng Yang <yypvsxf19870...@gmail.com>
>
>> Hi
>>   Thanks for the reply.
>>   I get the following information:
>> ----------------------------------------------------------
>> [solr@fkapp1 ~]$ ps mp 13359 -o THREAD,tid
>> USER     %CPU PRI SCNT WCHAN  USER SYSTEM   TID
>> solr      217   -    - -         -      -     -
>> solr      0.0  21    - 184466    -      -  13359
>> solr      0.0  19    - -         -      -  13360
>> solr      0.0  23    - 184466    -      -  13361
>> .....
>> solr     99.9  14    - -         -      -   1210
>> solr     99.9  14    - -         -      -   1223
>> solr     99.9  14    - -         -      -   1227
>> solr     99.9  14    - -         -      -   1228
>> ----------------------------------------------------------
>>   The suspicious threads are clearly 1210, 1223, 1227 and 1228; in
>> hexadecimal, 0x4ba, 0x4c7, 0x4cb and 0x4cc.
>>
>>   I then got the thread info with the jstack tool:
>> ----------------------------------------------------------
>> jstack -l 13359 > dump.stack
>> ----------------------------------------------------------
>>
>>   Finally, I found the stack info for the above threads.
>>
>>   I am not clear about this information. What does it mean?
>>   Is there anything abnormal with the SolrDispatchFilter?
>>
>> ----------------------------------------------------------
>> "http-bio-8081-exec-820" daemon prio=10 tid=0x00002aaac0d02800 nid=0x4cc runnable [0x0000000043c87000]
>>    java.lang.Thread.State: RUNNABLE
>>     at java.util.WeakHashMap.put(WeakHashMap.java:405)
>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:350)
>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:197)
>>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
>>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
>>     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
>>     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
>>     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
>>     at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:947)
>>     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
>>     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>     at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1009)
>>     at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
>>     at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)
>>     - locked <0x0000000640604558> (a org.apache.tomcat.util.net.SocketWrapper)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>     at java.lang.Thread.run(Thread.java:662)
>>
>>    Locked ownable synchronizers:
>>     - <0x000000064061a360> (a java.util.concurrent.locks.ReentrantLock$NonfairSync)
>>
>> "http-bio-8081-exec-802" daemon prio=10 tid=0x00002aaac0a03000 nid=0x4ba runnable [0x0000000047dc8000]
>>    java.lang.Thread.State: RUNNABLE
>>     at java.util.WeakHashMap.get(WeakHashMap.java:355)
>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:347)
>>     at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:197)
>>     at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
>>     at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
>>     at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
>>     at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
>>     at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
>>     at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
>>     at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:947)
>>     at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
>>     at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
>>     at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1009)
>>     at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
>>     at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:312)
>>     - locked <0x0000000640605288> (a org.apache.tomcat.util.net.SocketWrapper)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>     at java.lang.Thread.run(Thread.java:662)
>>
>>    Locked ownable synchronizers:
>>     - <0x0000000640605308> (a java.util.concurrent.locks.ReentrantLock$NonfairSync)
>>
>> ......
>> ----------------------------------------------------------
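
The thread-hunting procedure in the message above condenses to a few shell commands. A minimal sketch, assuming the PID (13359) and hot TIDs from the ps output:

----------------------------------------------------------
# convert the hot thread IDs to the hex "nid" values jstack prints
for tid in 1210 1223 1227 1228; do printf '0x%x\n' "$tid"; done
# -> 0x4ba 0x4c7 0x4cb 0x4cc

# dump all stacks, then pull the first frames of one suspicious thread
jstack -l 13359 > dump.stack
grep -A 20 'nid=0x4cc' dump.stack
----------------------------------------------------------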

>> 2014/1/16 Otis Gospodnetic <otis.gospodne...@gmail.com>
>>
>>> I wonder if this would be a good addition to the Solr Admin functionality?
>>>
>>> Otis
>>> --
>>> Performance Monitoring * Log Analytics * Search Analytics
>>> Solr & Elasticsearch Support * http://sematext.com/
>>>
>>> On Wed, Jan 15, 2014 at 6:29 AM, Mikhail Khludnev <
>>> mkhlud...@griddynamics.com> wrote:
>>>
>>> > Hello,
>>> >
>>> > Invoke top for the particular process with thread display enabled.
>>> > Find the hottest thread PID.
>>> > Invoke jstack for this process and find the suspicious thread by
>>> > "... nid=0x[PID in hex]".
>>> > ...
>>> > PROFIT!
>>> >
>>> > On Wed, Jan 15, 2014 at 1:38 PM, YouPeng Yang <yypvsxf19870...@gmail.com> wrote:
>>> >
>>> > > Hi
>>> > >   I find that the CPU ratio is very high while the Tomcat containing
>>> > > Solr 4.6 is idle. PID 13359 shows that my idle Solr web container
>>> > > takes a high CPU ratio.
>>> > >
>>> > >   Any insights?
>>> > >
>>> > > [solr@fkapp1 ~]$ top -d -1 -u solr
>>> > > top - 17:30:15 up 302 days, 7:10, 5 users, load average: 4.54, 4.52, 4.47
>>> > > Tasks: 418 total, 1 running, 412 sleeping, 0 stopped, 5 zombie
>>> > > Cpu(s): 19.1%us, 0.1%sy, 0.0%ni, 80.8%id, 0.0%wa, 0.0%hi, 0.0%si, 0.0%st
>>> > > Mem:  32955380k total, 28288212k used,  4667168k free,   503148k buffers
>>> > > Swap: 37257200k total,    87064k used, 37170136k free, 10861500k cached
>>> > >
>>> > >   PID USER  PR  NI  VIRT  RES  SHR S  %CPU %MEM     TIME+ COMMAND
>>> > > 13359 solr  21   0 11.4g 6.7g  12m S 400.5 21.4 491:30.85 java
>>> > >  3678 solr  15   0 13020 1380  828 R   0.0  0.0   0:19.16 top
>>> > >  3694 solr  15   0 66092 1556 1228 S   0.0  0.0   0:00.01 bash
>>> >
>>> >
>>> > --
>>> > Sincerely yours
>>> > Mikhail Khludnev
>>> > Principal Engineer,
>>> > Grid Dynamics
>>> >
>>> > <http://www.griddynamics.com>
>>> > <mkhlud...@griddynamics.com>
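
Mikhail's recipe above in runnable form. A minimal sketch (top's per-thread view is the -H flag on Linux; it varies by platform):

----------------------------------------------------------
top -H -p 13359                            # per-thread CPU for the Solr JVM
printf 'nid=0x%x\n' 1228                   # hex nid of the hottest thread PID
jstack -l 13359 | grep -A 20 'nid=0x4cc'   # its frames in the thread dump
----------------------------------------------------------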