Re: Solr Out of Memory Error
Dear Adam,

I also got the OutOfMemory exception. I changed JAVA_OPTS in catalina.sh as follows:

    ...
    if [ -z $LOGGING_MANAGER ]; then
      JAVA_OPTS=$JAVA_OPTS -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
    else
      JAVA_OPTS=$JAVA_OPTS -server -Xms8096m -Xmx8096m
    fi
    ...

Is this change correct? After that, I still got the same exception. The index is updated and searched frequently; I am trying to change the code to avoid the frequent updates, since I suspect that changing JAVA_OPTS alone will not help. Could you give me some help?

Thanks,
LB

On Wed, Jan 19, 2011 at 10:05 PM, Adam Estrada <estrada.adam.gro...@gmail.com> wrote:
> Is anyone familiar with the environment variable JAVA_OPTS? I set mine to
> a much larger heap size and never had any of these issues again.
>
> JAVA_OPTS = -server -Xms4048m -Xmx4048m
>
> Adam
[earlier quoted messages from Isan Fulia and Grijesh snipped; their posts appear elsewhere in this thread]
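A note on the snippet above: the heap flags sit in the else branch, so they are applied only when LOGGING_MANAGER is already set, and the unquoted assignments break once JAVA_OPTS contains spaces (`VAR=$OTHER word` runs `word` as a command). A minimal corrected sketch, not the stock catalina.sh — and assuming an 8 GB heap was intended (8096m is legal but odd; 8192m is exactly 8 GB):

```shell
# Hedged sketch: add the heap flags unconditionally and quote the
# expansions so JAVA_OPTS survives having spaces in it.
JAVA_OPTS="$JAVA_OPTS -server -Xms8192m -Xmx8192m"
if [ -z "$LOGGING_MANAGER" ]; then
  JAVA_OPTS="$JAVA_OPTS -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager"
fi
echo "$JAVA_OPTS"
```

On recent Tomcat versions it is usually cleaner to put such overrides in bin/setenv.sh (CATALINA_OPTS) than to edit catalina.sh directly.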
Re: Solr Out of Memory Error
Bing Li,

One should be conservative when setting Xmx. Also, setting Xmx alone might not do the trick at all, because the garbage collector might be the issue here. Configure the JVM to output garbage-collector debug logs and monitor the heap usage (especially the tenured generation) with a good tool like JConsole.

You might also want to take a look at your cache settings and autowarm parameters. In some scenarios with very frequent updates, a large corpus, and a high load of heterogeneous queries, you might want to dump the documentCache and queryResultCache: their hit ratio tends to be very low, so the caches just consume a lot of memory and CPU time. In one of my projects I finally decided to use only the filterCache. The other caches took too much RAM and CPU while running, with a lot of evictions and still a low hit ratio. I could, of course, have made the caches a lot bigger and increased autowarming, but that would have taken a lot of time before a cache was autowarmed, and a very, very large amount of RAM. I chose to rely on the OS cache instead.

Cheers,

[quoted message from LB and the earlier replies snipped; see the previous message in this thread]
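To act on the GC advice above, these are the usual HotSpot flags (Java 6 era) for GC debug logging, plus the JMX system property that lets a tool like JConsole attach; the log file path is an assumption:

```shell
# Hedged sketch: HotSpot GC logging flags plus JMX for JConsole.
# The log file location is an arbitrary choice.
GC_LOG="/var/log/solr/gc.log"
JAVA_OPTS="$JAVA_OPTS -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:$GC_LOG"
# Expose JMX so JConsole can watch heap pools such as the tenured generation
JAVA_OPTS="$JAVA_OPTS -Dcom.sun.management.jmxremote"
echo "$JAVA_OPTS"
```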
Re: Solr Out of Memory Error
I should also add that reducing the cache and autowarm sizes (or not using the caches at all) drastically reduces memory consumption when a new searcher is being prepared after a commit; memory usage will spike at these events. Again, use a monitoring tool to get more information on your specific scenario.

[earlier messages in the thread snipped]
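For reference, the cache and autowarm knobs discussed above live in solrconfig.xml. A hedged sketch in Solr 1.4 syntax — the sizes are illustrative assumptions, not recommendations, and commenting a cache element out disables that cache entirely:

```xml
<!-- Hedged sketch of solrconfig.xml cache settings for an update-heavy,
     low-hit-ratio workload; sizes here are only placeholders. -->
<query>
  <filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="128"/>
  <!-- Shrink, or comment out entirely, when their hit ratio stays low -->
  <queryResultCache class="solr.LRUCache" size="128" initialSize="128" autowarmCount="0"/>
  <documentCache class="solr.LRUCache" size="128" initialSize="128"/>
</query>
```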
Re: Solr Out of Memory Error
Hi all,

By adding more servers, do you mean sharding the index? And after sharding, how will my query performance be affected? Will the query execution time increase?

Thanks,
Isan Fulia

On 19 January 2011 12:52, Grijesh <pintu.grij...@gmail.com> wrote:
[quoted reply snipped; Grijesh's message appears later in this thread]
Re: Solr Out of Memory Error
Is anyone familiar with the environment variable JAVA_OPTS? I set mine to a much larger heap size and never had any of these issues again.

    JAVA_OPTS="-server -Xms4048m -Xmx4048m"

Adam

On Wed, Jan 19, 2011 at 3:29 AM, Isan Fulia <isan.fu...@germinait.com> wrote:
[quoted messages from Isan Fulia and Grijesh snipped; their posts appear elsewhere in this thread]
Re: Solr Out of Memory Error
By adding more servers I mean adding more searchers (slaves) behind a load balancer; I am not talking about sharding. Sharding is required when your index grows to a size of about 50 GB.

-----
Thanx:
Grijesh
--
View this message in context: http://lucene.472066.n3.nabble.com/Solr-Out-of-Memory-Error-tp2280037p2292944.html
Sent from the Solr - User mailing list archive at Nabble.com.
Solr Out of Memory Error
Hi all,

I got the following error on Solr, on a machine with 4 GB RAM and an Intel dual-core processor. Can you please help me out?

java.lang.OutOfMemoryError: Java heap space
2011-01-18 18:00:27.655:WARN::Committed before 500 OutOfMemoryError likely caused by the Sun VM Bug described in https://issues.apache.org/jira/browse/LUCENE-1566; try calling FSDirectory.setReadChunkSize with a value smaller than the current chunk size (2147483647)
java.lang.OutOfMemoryError: OutOfMemoryError likely caused by the Sun VM Bug described in https://issues.apache.org/jira/browse/LUCENE-1566; try calling FSDirectory.setReadChunkSize with a value smaller than the current chunk size (2147483647)
    at org.apache.lucene.store.NIOFSDirectory$NIOFSIndexInput.readInternal(NIOFSDirectory.java:161)
    at org.apache.lucene.store.BufferedIndexInput.readBytes(BufferedIndexInput.java:139)
    at org.apache.lucene.index.CompoundFileReader$CSIndexInput.readInternal(CompoundFileReader.java:285)
    at org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:160)
    at org.apache.lucene.store.BufferedIndexInput.readByte(BufferedIndexInput.java:39)
    at org.apache.lucene.store.DataInput.readVInt(DataInput.java:86)
    at org.apache.lucene.index.FieldsReader.doc(FieldsReader.java:201)
    at org.apache.lucene.index.SegmentReader.document(SegmentReader.java:828)
    at org.apache.lucene.index.DirectoryReader.document(DirectoryReader.java:579)
    at org.apache.lucene.index.IndexReader.document(IndexReader.java:755)
    at org.apache.solr.search.SolrIndexReader.document(SolrIndexReader.java:454)
    at org.apache.solr.search.SolrIndexSearcher.doc(SolrIndexSearcher.java:431)
    at org.apache.solr.response.BinaryResponseWriter$Resolver.writeDocList(BinaryResponseWriter.java:120)
    at org.apache.solr.response.BinaryResponseWriter$Resolver.resolve(BinaryResponseWriter.java:86)
    at org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:143)
    at org.apache.solr.common.util.JavaBinCodec.writeNamedList(JavaBinCodec.java:133)
    at org.apache.solr.common.util.JavaBinCodec.writeKnownType(JavaBinCodec.java:221)
    at org.apache.solr.common.util.JavaBinCodec.writeVal(JavaBinCodec.java:138)
    at org.apache.solr.common.util.JavaBinCodec.marshal(JavaBinCodec.java:87)
    at org.apache.solr.response.BinaryResponseWriter.write(BinaryResponseWriter.java:46)
    at org.apache.solr.servlet.SolrDispatchFilter.writeResponse(SolrDispatchFilter.java:321)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:253)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418)
    at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
    at org.mortbay.jetty.handler.HandlerCollection.handle(HandlerCollection.java:114)
    at org.mortbay.jetty.handler.HandlerWrapper.handle(HandlerWrapper.java:152)
    at org.mortbay.jetty.Server.handle(Server.java:326)
    at org.mortbay.jetty.HttpConnection.handleRequest(HttpConnection.java:542)
    at org.mortbay.jetty.HttpConnection$RequestHandler.content(HttpConnection.java:938)
    at org.mortbay.jetty.HttpParser.parseNext(HttpParser.java:755)
    at org.mortbay.jetty.HttpParser.parseAvailable(HttpParser.java:218)
    at org.mortbay.jetty.HttpConnection.handle(HttpConnection.java:404)
    at org.mortbay.jetty.bio.SocketConnector$Connection.run(SocketConnector.java:228)
    at org.mortbay.thread.QueuedThreadPool$PoolThread.run(QueuedThreadPool.java:582)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded

2011-01-18 18:00:27.656:WARN::/solr/ProdContentIndex/select java.lang.IllegalStateException: Committed
    at org.mortbay.jetty.Response.resetBuffer(Response.java:1024)
    at org.mortbay.jetty.Response.sendError(Response.java:240)
    at org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:361)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:271)
    at org.mortbay.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1157)
    at org.mortbay.jetty.servlet.ServletHandler.handle(ServletHandler.java:388)
    at org.mortbay.jetty.security.SecurityHandler.handle(SecurityHandler.java:216)
    at org.mortbay.jetty.servlet.SessionHandler.handle(SessionHandler.java:182)
    at org.mortbay.jetty.handler.ContextHandler.handle(ContextHandler.java:765)
    at org.mortbay.jetty.webapp.WebAppContext.handle(WebAppContext.java:418)
    at org.mortbay.jetty.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:230)
Re: Solr Out of Memory Error
Hi,

I haven't seen one like this before. Please provide your JVM settings and Solr version.

Cheers

On Tuesday 18 January 2011 15:08:35 Isan Fulia wrote:
[quoted message and full stack trace snipped; see the original post above]
Re: Solr Out of Memory Error
Hi Markus,

We don't have any Xmx memory settings as such. Our Java version is 1.6.0_19 and our Solr version is a 1.4 development version. Can you please help us out?

Thanks,
Isan

On 18 January 2011 19:54, Markus Jelsma <markus.jel...@openindex.io> wrote:
[quoted reply and stack trace snipped]
Re: Solr Out of Memory Error
I've run into that "GC overhead limit exceeded" message before. Try:

    -XX:-UseGCOverheadLimit

to avoid that one, but it basically means you're running low on memory, and turning off GCOverheadLimit may just lead to a real heap-space error a few seconds later. Also, if you have free memory on the host, try using the -Xmx command-line parameter to raise the maximum amount of memory Solr can use (-Xmx2g, for example).

----- Original Message -----
From: Isan Fulia <isan.fu...@germinait.com>
To: markus.jel...@openindex.io
Cc: solr-user@lucene.apache.org
Sent: Tue, January 18, 2011 9:04:31 PM
Subject: Re: Solr Out of Memory Error
[quoted message and stack trace snipped]
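In JAVA_OPTS terms, the two suggestions above would look like the following sketch; 2 GB is only an example value, and disabling the overhead limit does not add any memory:

```shell
# Hedged sketch: raise the heap ceiling and optionally disable the GC
# overhead limit check. Disabling the check only postpones a plain
# "Java heap space" OOM if the heap really is too small.
JAVA_OPTS="$JAVA_OPTS -Xmx2g -XX:-UseGCOverheadLimit"
echo "$JAVA_OPTS"
```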
Re: Solr Out of Memory Error
On which server [master/slave] does the Out of Memory error occur?
What is your index size [GB]?
How many documents do you have?
What is your query rate (queries per second)?
How are you indexing?
What is your ramBufferSize?

-----
Thanx:
Grijesh
--
View this message in context: http://lucene.472066.n3.nabble.com/Solr-Out-of-Memory-Error-tp2280037p2285392.html
Sent from the Solr - User mailing list archive at Nabble.com.
Re: Solr Out of Memory Error
Hi Grijesh, all,

We have only a single master and use a multicore environment, with index sizes of 675 MB, 516 MB, 3 GB, and 25 GB. The 3 GB index holds roughly 14 lakh (1.4 million) documents, and the 25 GB index roughly 7 lakh (700,000). Queries are fired very frequently. ramBufferSize and indexing settings are all defaults.

Thanks,
Isan

On 19 January 2011 10:41, Grijesh <pintu.grij...@gmail.com> wrote:
[quoted questions snipped; see Grijesh's message above]
Re: Solr Out of Memory Error
Hi Isan,

It seems your index size (25 GB) is much larger than your total RAM (4 GB). You can do two things to avoid the Out of Memory problem:

1. Buy more RAM; add at least 12 GB more.
2. Increase the memory allocated to Solr by setting the Xmx value; allocate at least 12 GB to Solr.

If your whole index fits into cache memory, it will give you better results. Also add more servers for load balancing, since your QPS is high. Your 7 lakh documents making a 25 GB index looks quite high; try to lower the index size. What are you indexing in your 25 GB index?

-----
Thanx:
Grijesh
--
View this message in context: http://lucene.472066.n3.nabble.com/Solr-Out-of-Memory-Error-tp2280037p2285779.html
Sent from the Solr - User mailing list archive at Nabble.com.