Re: Out of memory errors with Spatial indexing

2020-07-06 Thread David Smiley
I believe you are experiencing this bug: LUCENE-5056. The fix would probably be adjusting code here: org.apache.lucene.spatial.query.SpatialArgs#calcDistanceFromErrPct ~ David Smiley, Apache Lucene/Solr Search Developer

Re: Out of memory errors with Spatial indexing

2020-07-06 Thread Sunil Varma
Hi David, Thanks for your response. Yes, I noticed that all the data causing the issue were at the poles. I tried the "RptWithGeometrySpatialField" field type definition but get a "Spatial context does not support S2 spatial index" error. Setting spatialContextFactory="Geo3D", I still see the original

Re: Out of memory errors with Spatial indexing

2020-07-03 Thread David Smiley
Hi Sunil, Your shape is at a pole, and I'm aware of a bug causing an exponential explosion of needed grid squares when you have polygons super-close to the pole. Might you try S2PrefixTree instead? I forget if this would fix it or not by itself. For indexing non-point data, I recommend

Re: Out of Memory Errors

2017-06-14 Thread Susheel Kumar
The attachment will not come through. Can you upload it via Dropbox / other sharing sites etc.? On Wed, Jun 14, 2017 at 12:41 PM, Satya Marivada wrote: > Susheel, Please see attached. The heap towards the end of the graph has > spiked > > > > On Wed, Jun 14, 2017 at 11:46

Re: Out of Memory Errors

2017-06-14 Thread Satya Marivada
Susheel, Please see attached. The heap towards the end of the graph has spiked. On Wed, Jun 14, 2017 at 11:46 AM Susheel Kumar wrote: > You may have gc logs saved when OOM happened. Can you draw it in GC Viewer > or so and share. > > Thnx > > On Wed, Jun 14, 2017 at 11:26

Re: Out of Memory Errors

2017-06-14 Thread Susheel Kumar
You may have GC logs saved from when the OOM happened. Can you plot them in GC Viewer or similar and share? Thnx On Wed, Jun 14, 2017 at 11:26 AM, Satya Marivada wrote: > Hi, > > I am getting Out of Memory Errors after a while on solr-6.3.0. > The

Re: Out of memory error during full import

2016-02-04 Thread Shawn Heisey
On 2/4/2016 12:18 AM, Srinivas Kashyap wrote: > I have implemented 'SortedMapBackedCache' in my SqlEntityProcessor for the > child entities in data-config.xml. When I try to do a full import, I'm getting > an OutOfMemory error (Java Heap Space). I increased the HEAP allocation to the > maximum extent

Re: out of memory when trying to sort by id in a 1.5 billion index

2014-11-07 Thread Yago Riveiro
For sorting, DocValues are the best option I think. — /Yago Riveiro On Fri, Nov 7, 2014 at 12:45 PM, adfel70 adfe...@gmail.com wrote: hi I have 11 machines in my cluster. Each machine has 128GB memory, 2 Solr JVMs with 12GB heap each. The cluster has 7 shards, 3 replicas. 1.5 billion docs total.
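A minimal SolrJ sketch of that suggestion (not code from the thread): it registers a sort field with docValues="true" via the Schema API so sorting reads column-oriented data from disk instead of building a FieldCache on the heap. The field name "id_sort", the collection URL, and the SolrJ 6+ client construction are assumptions for illustration.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.schema.SchemaRequest;

public class AddDocValuesSortField {
    public static void main(String[] args) throws Exception {
        // Hypothetical collection URL; adjust for your cluster.
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            Map<String, Object> field = new LinkedHashMap<>();
            field.put("name", "id_sort");     // hypothetical sort field
            field.put("type", "string");      // single, non-tokenized value per doc
            field.put("indexed", true);
            field.put("stored", false);
            field.put("docValues", true);     // keeps sort data off the Java heap
            new SchemaRequest.AddField(field).process(client);
        }
    }
}
```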

Re: out of memory when trying to sort by id in a 1.5 billion index

2014-11-07 Thread Chris Hostetter
: For sorting DocValues are the best option I think. yep, definitely a good idea. : I have a usecase for using cursor paging and when I tried to check this, I got : OutOfMemory just for sorting by id. what does the field/fieldType for your uniqueKey field look like? If you aren't using
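For context, a minimal SolrJ sketch of cursor-based deep paging sorted on the uniqueKey (not code from the thread; the field name "id", the collection URL, and the SolrJ 6+ client are assumptions):

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.params.CursorMarkParams;

public class CursorPaging {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            SolrQuery q = new SolrQuery("*:*");
            q.setRows(500);
            q.setSort("id", SolrQuery.ORDER.asc);          // cursorMark requires a sort ending on the uniqueKey
            String cursor = CursorMarkParams.CURSOR_MARK_START;
            while (true) {
                q.set(CursorMarkParams.CURSOR_MARK_PARAM, cursor);
                QueryResponse rsp = client.query(q);
                // process rsp.getResults() here
                String next = rsp.getNextCursorMark();
                if (cursor.equals(next)) break;            // cursor unchanged means no more results
                cursor = next;
            }
        }
    }
}
```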

Re: Out of Memory when I download 5 Million records from sqlserver to solr

2014-07-02 Thread Shawn Heisey
On 7/1/2014 4:57 AM, mskeerthi wrote: I have to download my 5 million records from sqlserver to solr into one index. I am getting the below exception after downloading 1 million records. Is there any configuration or other way to download from sqlserver to solr? Below is the exception I am getting

Re: Out of Memory when I download 5 Million records from sqlserver to solr

2014-07-01 Thread Aman Tandon
You can try giving some more memory to Solr. On Jul 1, 2014 4:41 PM, mskeerthi mskeer...@gmail.com wrote: I have to download my 5 million records from sqlserver to solr into one index. I am getting the below exception after downloading 1 Million records. Is there any configuration or another to

Re: Out of Memory when I download 5 Million records from sqlserver to solr

2014-07-01 Thread IJ
We faced similar problems on our side. We found it more reliable to have a mechanism to extract all data from the database into a flat file - and then use a Java program to bulk index into Solr from the file via the SolrJ API.
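A hedged sketch of that flat-file approach (not the poster's actual program): read one record per line and send batched documents via SolrJ. The tab-separated layout, field names, file path, and collection URL are all assumptions.

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class FlatFileIndexer {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build();
             BufferedReader in = Files.newBufferedReader(Paths.get("export.tsv"))) {
            List<SolrInputDocument> batch = new ArrayList<>();
            String line;
            while ((line = in.readLine()) != null) {
                String[] cols = line.split("\t", -1);      // hypothetical layout: id <TAB> title per line
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", cols[0]);
                doc.addField("title", cols[1]);
                batch.add(doc);
                if (batch.size() == 1000) {                // bounded batches keep client memory flat
                    client.add(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) client.add(batch);
            client.commit();
        }
    }
}
```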

RE: out of memory during indexing due to large incoming queue

2013-06-17 Thread Yoni Amir
: Shawn Heisey [mailto:s...@elyograg.org] Sent: Monday, June 03, 2013 17:08 To: solr-user@lucene.apache.org Subject: Re: out of memory during indexing due to large incoming queue On 6/3/2013 1:06 AM, Yoni Amir wrote: Solrconfig.xml - http://apaste.info/dsbv Schema.xml - http://apaste.info/67PI

Re: out of memory during indexing due to large incoming queue

2013-06-17 Thread Shawn Heisey
On 6/17/2013 4:32 AM, Yoni Amir wrote: I was wondering about your recommendation to use facet.method=enum? Can you explain the trade-off here? I understand that I gain a benefit by using less memory, but what do I lose? Is it speed? The problem with facet.method=fc (the default)
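A minimal sketch of requesting that setting through SolrJ (not from the thread): facet.method=enum walks the terms and uses filters instead of un-inverting the whole field into memory. The facet field name "category" and the collection URL are assumptions.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class EnumMethodFacets {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            SolrQuery q = new SolrQuery("*:*");
            q.setRows(0);                          // only the facet counts are needed
            q.setFacet(true);
            q.addFacetField("category");           // hypothetical low-cardinality field
            q.set("facet.method", "enum");         // trade FieldCache memory for per-term filter lookups
            QueryResponse rsp = client.query(q);
            rsp.getFacetField("category").getValues()
               .forEach(c -> System.out.println(c.getName() + " -> " + c.getCount()));
        }
    }
}
```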

RE: out of memory during indexing due to large incoming queue

2013-06-03 Thread Yoni Amir
mean that's the number I saw in Solr UI. I assume there were many more files than that. Thanks, Yoni -Original Message- From: Shawn Heisey [mailto:s...@elyograg.org] Sent: Sunday, June 02, 2013 22:53 To: solr-user@lucene.apache.org Subject: Re: out of memory during indexing due to large

Re: out of memory during indexing due to large incoming queue

2013-06-02 Thread Shreejay
A couple of things: 1) can you give some more details about your setup ? Like whether its cloud or single instance . How many nodes if its cloud. The hardware - memory per machine , JVM options. Etc 2) any specific reason for using 4.0 beta? The latest version is 4.3. I used 4.0 for a few

Re: out of memory during indexing due to large incoming queue

2013-06-02 Thread Shawn Heisey
On 6/2/2013 8:16 AM, Yoni Amir wrote: Hello, I am receiving OutOfMemoryError during indexing, and after investigating the heap dump, I am still missing some information, and I thought this might be a good place for help. I am using Solr 4.0 beta, and I have 5 threads that send update

RE: out of memory during indexing due to large incoming queue

2013-06-02 Thread Yoni Amir
number of segments? Is it possible? Thanks again, Yoni -Original Message- From: Shawn Heisey [mailto:s...@elyograg.org] Sent: Sunday, June 02, 2013 18:05 To: solr-user@lucene.apache.org Subject: Re: out of memory during indexing due to large incoming queue On 6/2/2013 8:16 AM, Yoni Amir

Re: out of memory during indexing due to large incoming queue

2013-06-02 Thread Shawn Heisey
On 6/2/2013 12:25 PM, Yoni Amir wrote: Hi Shawn and Shreejay, thanks for the response. Here is some more information: 1) The machine is a virtual machine on ESX server. It has 4 CPUs and 8GB of RAM. I don't remember what CPU but something modern enough. It is running Java 7 without any

Re: Out of memory on some faceting queries

2013-04-08 Thread Dotan Cohen
On Wed, Apr 3, 2013 at 8:47 PM, Shawn Heisey s...@elyograg.org wrote: On 4/2/2013 3:09 AM, Dotan Cohen wrote: I notice that this only occurs on queries that run facets. I start Solr with the following command: sudo nohup java -XX:NewRatio=1 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC

Re: Out of memory on some faceting queries

2013-04-03 Thread Toke Eskildsen
On Tue, 2013-04-02 at 17:08 +0200, Dotan Cohen wrote: Most of the time I facet on one field that has about twenty unique values. They are likely to be disk cached so warming those for 9M documents should only take a few seconds. However, once per day I would like to facet on the text field,

Re: Out of memory on some faceting queries

2013-04-03 Thread Dotan Cohen
On Tue, Apr 2, 2013 at 6:26 PM, Andre Bois-Crettez andre.b...@kelkoo.com wrote: warmupTime is available on the admin page for each type of cache (in milliseconds) : http://solr-box:8983/solr/#/core1/plugins/cache Or if you are only interested in the total :

Re: Out of memory on some faceting queries

2013-04-03 Thread Dotan Cohen
On Wed, Apr 3, 2013 at 10:11 AM, Toke Eskildsen t...@statsbiblioteket.dk wrote: However, once per day I would like to facet on the text field, which is a free-text field usually around 1 KiB (about 100 words), in order to determine what the top keywords / topics are. That query would take up

Re: Out of memory on some faceting queries

2013-04-03 Thread Shawn Heisey
On 4/2/2013 3:09 AM, Dotan Cohen wrote: I notice that this only occurs on queries that run facets. I start Solr with the following command: sudo nohup java -XX:NewRatio=1 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -Dsolr.solr.home=/mnt/SolrFiles100/solr -jar

Re: Out of memory on some faceting queries

2013-04-02 Thread Toke Eskildsen
On Tue, 2013-04-02 at 11:09 +0200, Dotan Cohen wrote: On some queries I get out of memory errors: {error:{msg:java.lang.OutOfMemoryError: Java heap [...] org.apache.lucene.index.DocTermOrds.uninvert(DocTermOrds.java:273)\n\tat

Re: Out of memory on some faceting queries

2013-04-02 Thread Dotan Cohen
On Tue, Apr 2, 2013 at 12:59 PM, Toke Eskildsen t...@statsbiblioteket.dk wrote: How many documents does your index have, how many fields do you facet on and approximately how many unique values does your facet fields have? 8971763 documents, growing at a rate of about 500 per minute. We

Re: Out of memory on some faceting queries

2013-04-02 Thread Toke Eskildsen
On Tue, 2013-04-02 at 12:16 +0200, Dotan Cohen wrote: 8971763 documents, growing at a rate of about 500 per minute. We actually expect that to be ~5 per minute once we get out of testing. 9M documents in a heavily updated index with faceting. Maybe you are committing faster than the

Re: Out of memory on some faceting queries

2013-04-02 Thread Dotan Cohen
On Tue, Apr 2, 2013 at 2:41 PM, Toke Eskildsen t...@statsbiblioteket.dk wrote: 9M documents in a heavily updated index with faceting. Maybe you are committing faster than the faceting can be prepared?
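A hedged aside on that point (not a suggestion made in the thread itself): if the client issues an explicit commit per update, letting Solr coalesce commits with commitWithin is one common way to cap the commit rate. The URL and field values below are illustrative only.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class CommitWithinExample {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "doc-1");
            doc.addField("text", "example body");
            // Ask Solr to make this visible within 60s, letting many updates share one commit
            client.add(doc, 60_000);
        }
    }
}
```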

Re: Out of memory on some faceting queries

2013-04-02 Thread Toke Eskildsen
On Tue, 2013-04-02 at 15:55 +0200, Dotan Cohen wrote: [Toke: maxWarmingSearchers limit exceeded?] Thank you Toke, this is exactly on my list of things to learn about Solr. We do get the error mentioned and we cannot reduce the amount of commits. Also, I do believe that we have the necessary

Re: Out of memory on some faceting queries

2013-04-02 Thread Dotan Cohen
On Tue, Apr 2, 2013 at 5:33 PM, Toke Eskildsen t...@statsbiblioteket.dk wrote: On Tue, 2013-04-02 at 15:55 +0200, Dotan Cohen wrote: [Toke: maxWarmingSearchers limit exceeded?] Thank you Toke, this is exactly on my list of things to learn about Solr. We do get the error mentioned and we

Re: Out of memory on some faceting queries

2013-04-02 Thread Dotan Cohen
On Tue, Apr 2, 2013 at 5:33 PM, Toke Eskildsen t...@statsbiblioteket.dk wrote: Memory does not help you if you commit too frequently. If you commit each X seconds and warming takes X+Y seconds, then you will run out of memory at some point. How might I time the warming? I've been googling

Re: Out of memory on some faceting queries

2013-04-02 Thread Dotan Cohen
How often do you commit and how many unique values does your facet fields have? Most of the time I facet on one field that has about twenty unique values. However, once per day I would like to facet on the text field, which is a free-text field usually around 1 KiB (about 100 words), in order

Re: Out of memory on some faceting queries

2013-04-02 Thread Andre Bois-Crettez
On 04/02/2013 05:04 PM, Dotan Cohen wrote: How might I time the warming? I've been googling warming since your earlier message but there does not seem to be any really good documentation on the subject. If there is anything that you feel I should be reading I would appreciate a link or a keyword

Re: Out of Memory doing a query Solr 4.2

2013-03-15 Thread Bernd Fehling
We are currently using Oracle Corporation Java HotSpot(TM) 64-Bit Server VM (1.7.0_07 23.3-b01). It runs excellently and no memory parameter tweaking is necessary. Give it enough physical and JVM memory, use -XX:+UseG1GC and that's it. Also no sawtooth and GC timeouts from the JVM as with earlier

Re: Out of Memory doing a query Solr 4.2

2013-03-15 Thread Robert Muir
On Fri, Mar 15, 2013 at 6:46 AM, raulgrande83 raulgrand...@hotmail.com wrote: Thank you for your help. I'm afraid it won't be so easy to change the JVM version, because it is required at the moment. It seems that Solr 4.2 supports Java 1.6 at least. Is that correct? Could you find any clue of

Re: Out of Memory doing a query Solr 4.2

2013-03-14 Thread Robert Muir
On Thu, Mar 14, 2013 at 12:07 PM, raulgrande83 raulgrand...@hotmail.com wrote: JVM: IBM J9 VM(1.6.0.2.4) I don't recommend using this JVM.

Re: Out Of Memory =( Too many cores on one server?

2012-11-21 Thread Shawn Heisey
On 11/21/2012 12:36 AM, stockii wrote: okay. I will try out more RAM. I am not using much caching because of near-real-time search. In this case is it better to increase Xmn or only Xmx and Xms? I have personally found that increasing the size of the young generation (Eden) is beneficial to

Re: Out Of Memory =( Too many cores on one server?

2012-11-21 Thread Mark Miller
I have personally found that increasing the size of the young generation (Eden) is beneficial to Solr, I've seen the same thing - I think it's because requests create a lot of short lived objects and if the eden is not large enough, a lot of those objects will make it to the tenured space,

Re: Out Of Memory =( Too many cores on one server?

2012-11-16 Thread Bernd Fehling
I guess you should give JVM more memory. When starting to find a good value for -Xmx I oversized and set it to Xmx20G and Xms20G. Then I monitored the system and saw that JVM is between 5G and 10G (java7 with G1 GC). Now it is finally set to Xmx11G and Xms11G for my system with 1 core and 38

Re: Out Of Memory =( Too many cores on one server?

2012-11-16 Thread Vadim Kisselmann
Hi, your JVM needs more RAM. My setup works well with 10 cores and 300 million docs, Xmx8GB Xms8GB, 16GB for the OS. But as Bernd mentioned, the memory consumption depends on the number of fields and the fieldCache. Best Regards Vadim 2012/11/16 Bernd Fehling bernd.fehl...@uni-bielefeld.de: I

Re: Out of Memory

2012-01-31 Thread Erick Erickson
Right. Multivalued fields use the fieldCache for faceting (as I remember) whereas single-valued fields don't under some circumstances. See: http://wiki.apache.org/solr/SolrCaching#The_Lucene_FieldCache Before your change, you were probably using the filterCache for what faceting you were doing. So

Re: Out of memory during the indexing

2011-12-06 Thread Erick Erickson
I'm going to defer to the folks who actually know the guts here. If you've turned down the cache entries for your Solr caches, you're pretty much left with Lucene caching which is a mystery... Best Erick On Mon, Dec 5, 2011 at 9:23 AM, Jeff Crump jeffrey.cr...@gmail.com wrote: Yes, and without

Re: Out of memory during the indexing

2011-12-05 Thread Erick Erickson
There's no good way to say to Solr "use only this much memory for searching". You can certainly limit the size somewhat by configuring your caches to be small. But if you're sorting, then Lucene will use up some cache space etc. Are you actually running into problems? Best Erick On Fri, Dec 2,

Re: Out of memory during the indexing

2011-12-05 Thread Jeff Crump
Yes, and without doing much in the way of queries, either. Basically, our test data has large numbers of distinct terms, each of which can be large in themselves. Heap usage is a straight line -- up -- 75 percent of the heap is consumed with byte[] allocations at the leaf of an object graph

Re: Out of memory during the indexing

2011-12-02 Thread Jeff Crump
Can anyone advise techniques for limiting the size of the RAM buffers to begin with? As the index grows, I shouldn't have to keep increasing the heap. We have a high-ingest, low-query-rate environment and I'm not as much concerned with the query-time caches as I am with the segment core
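As a hedged aside on the question above (not an answer from the thread): in Solr the indexing RAM buffer is bounded by <ramBufferSizeMB> in solrconfig.xml; the equivalent knob at the Lucene level looks roughly like the sketch below (recent Lucene API assumed; path and size are illustrative only).

```java
import java.nio.file.Paths;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;

public class BoundedRamBuffer {
    public static void main(String[] args) throws Exception {
        IndexWriterConfig iwc = new IndexWriterConfig(new StandardAnalyzer());
        iwc.setRAMBufferSizeMB(64.0);   // flush in-memory segments once ~64 MB of buffered docs accumulate
        try (IndexWriter writer = new IndexWriter(FSDirectory.open(Paths.get("/tmp/index")), iwc)) {
            writer.commit();            // nothing indexed here; just shows where the config is applied
        }
    }
}
```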

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Andre Bois-Crettez
Using Solr 3.4.0. That changelog actually says it should reduce memory usage for that version. We were on a much older version previously, 1.something. Norms are off on all fields that it can be turned off on. I'm just hoping this new version doesn't have any leaks. Does FastLRUCache vs

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Paul Libbrecht
Steve, do you have any custom code in your Solr? We had out-of-memory errors just because of that: I was using one method to obtain the request which was leaking... I had not read the javadoc carefully enough. Since then, no leak. What do you do after the OoME? paul On 9 Nov 2011 at 21:33, Steve

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula
From: Paul Libbrecht p...@hoplahup.net To: solr-user@lucene.apache.org Sent: Thursday, November 10, 2011 7:19 AM Subject: Re: Out of memory, not during import or updates of the index do you have any custom code in your Solr? We had out-of-memory errors just because of that, I was using one

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula
From: Andre Bois-Crettez andre.b...@kelkoo.com To: solr-user@lucene.apache.org solr-user@lucene.apache.org Sent: Thursday, November 10, 2011 7:02 AM Subject: Re: Out of memory, not during import or updates of the index You can add JVM parameters to better trace the heap usage with -XX

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Mark Miller
How big is your index? What kind of queries do you tend to see? Do you facet on a lot of fields? Sort on a lot of fields? Before you get the OOM and are running along nicely, how much RAM is used? On Nov 9, 2011, at 3:33 PM, Steve Fatula wrote: We get at rare times out of memory errors

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula
From: Mark Miller markrmil...@gmail.com To: solr-user solr-user@lucene.apache.org Sent: Thursday, November 10, 2011 3:00 PM Subject: Re: Out of memory, not during import or updates of the index How big is your index? The total for the data dir is 651M. What kind of queries do you tend to see

Re: Out of memory during the indexing

2011-11-09 Thread Andre Bois-Crettez
How much memory you actually allocate to the JVM ? http://wiki.apache.org/solr/SolrPerformanceFactors#Memory_allocated_to_the_Java_VM You need to increase the -Xmx value, otherwise your large ram buffers won't fit in the java heap. sivaprasad wrote: Hi, I am getting the following error

Re: Out of memory, not during import or updates of the index

2011-11-09 Thread Otis Gospodnetic
Hi, Some options: * Yes, on the slave/search side you can reduce your cache sizes and lower the memory footprint. * You can also turn off norms on various fields if you don't need them and save memory there. * You can increase your Xmx. I don't know what version of Solr you have, but look

Re: Out of memory, not during import or updates of the index

2011-11-09 Thread Steve Fatula
From: Otis Gospodnetic otis_gospodne...@yahoo.com To: solr-user@lucene.apache.org solr-user@lucene.apache.org Sent: Wednesday, November 9, 2011 2:51 PM Subject: Re: Out of memory, not during import or updates of the index Hi, Some options: * Yes, on the slave/search side you can reduce your

RE: Out of memory

2011-09-16 Thread Chris Hostetter
: Actually I am storing Twitter streaming data into the core, so the rate of : indexing is about 12 tweets (docs)/second. The same Solr contains 3 other cores ... : . At any given time I don't need data more than the past 15 days, unless : someone queries for it explicitly. How can this be

Re: Out of memory

2011-09-16 Thread Luis Cappa Banda
Hello. Facet queries are slower than others, especially when you are working with a 69G index. I would like to know more about the context in which the Out of memory exception occurs: is it during indexing? Do you index at the same time as users launch queries against the twitter index? Are you using

RE: Out of memory

2011-09-16 Thread Rohit
no idea about sharding right now, if you could point me to some resource for date wise sharding. Regards, Rohit -Original Message- From: Chris Hostetter [mailto:hossman_luc...@fucit.org] Sent: 17 September 2011 00:19 To: solr-user@lucene.apache.org Subject: RE: Out of memory : Actually I am

RE: Out of memory

2011-09-16 Thread Rohit
Me: http://about.me/rohitg -Original Message- From: Luis Cappa Banda [mailto:luisca...@gmail.com] Sent: 17 September 2011 03:38 To: solr-user@lucene.apache.org Subject: Re: Out of memory Hello. Facet queries are slower than others specially when you are working with a 69G index. I

Re: Out of memory

2011-09-15 Thread Dmitry Kan
again Rohit -Original Message- From: Dmitry Kan [mailto:dmitry@gmail.com] Sent: 14 September 2011 10:23 To: solr-user@lucene.apache.org Subject: Re: Out of memory Hi, OK 64GB fits into one shard quite nicely in our setup. But I have never used multicore setup. In total you

RE: Out of memory

2011-09-15 Thread Rohit
It's happening more in search and search has become very slow particularly on the core with 69GB index data. Regards, Rohit -Original Message- From: Dmitry Kan [mailto:dmitry@gmail.com] Sent: 15 September 2011 07:51 To: solr-user@lucene.apache.org Subject: Re: Out of memory Hello

Re: Out of memory

2011-09-15 Thread Dmitry Kan
: Re: Out of memory Hello, Since you use caching, you can monitor the eviction parameter on the Solr admin page (http://localhost:port/solr/admin/stats.jsp#cache). If it is non-zero, the cache can e.g. be made bigger. queryResultWindowSize=50 in my case. Not sure if Solr 3.1 supports

RE: Out of memory

2011-09-15 Thread Rohit
Thanks Dmitry, let me look into sharding concepts. Regards, Rohit Mobile: +91-9901768202 About Me: http://about.me/rohitg -Original Message- From: Dmitry Kan [mailto:dmitry@gmail.com] Sent: 15 September 2011 10:15 To: solr-user@lucene.apache.org Subject: Re: Out of memory If you

RE: Out of memory

2011-09-14 Thread Rohit
Message- From: Jaeger, Jay - DOT [mailto:jay.jae...@dot.wi.gov] Sent: 13 September 2011 21:06 To: solr-user@lucene.apache.org Subject: RE: Out of memory numDocs is not the number of documents in memory. It is the number of documents currently in the index (which is kept on disk). Same goes

Re: Out of memory

2011-09-14 Thread Dmitry Kan
: +91-9901768202 About Me: http://about.me/rohitg -Original Message- From: Jaeger, Jay - DOT [mailto:jay.jae...@dot.wi.gov] Sent: 13 September 2011 21:06 To: solr-user@lucene.apache.org Subject: RE: Out of memory numDocs is not the number of documents in memory. It is the number

RE: Out of memory

2011-09-14 Thread Rohit
a jconsole to my solr as suggested to get a better picture. Regards, Rohit -Original Message- From: Dmitry Kan [mailto:dmitry@gmail.com] Sent: 14 September 2011 08:15 To: solr-user@lucene.apache.org Subject: Re: Out of memory Hi Rohit, Do you use caching? How big is your index

Re: Out of memory

2011-09-14 Thread Dmitry Kan
: Dmitry Kan [mailto:dmitry@gmail.com] Sent: 14 September 2011 08:15 To: solr-user@lucene.apache.org Subject: Re: Out of memory Hi Rohit, Do you use caching? How big is your index in size on the disk? What is the stack trace contents? The OOM problems that we have seen so far were related

RE: Out of memory

2011-09-14 Thread Rohit
Subject: Re: Out of memory Hi, OK 64GB fits into one shard quite nicely in our setup. But I have never used multicore setup. In total you have 79,9 GB. We try to have 70-100GB per shard with caching on. Do you do warming up of your index on starting? Also, there was a setting of pre-populating the cache

RE: Out of memory

2011-09-13 Thread Jaeger, Jay - DOT
numDocs is not the number of documents in memory. It is the number of documents currently in the index (which is kept on disk). Same goes for maxDocs, except that it is a count of all of the documents that have ever been in the index since it was created or optimized (including deleted

RE: Out of memory on sorting

2011-05-26 Thread pravesh
For saving memory: 1. Allocate as much memory as you can to the JVM (especially if you are using a 64-bit OS). 2. You can set omitNorms=true for your date/id fields (actually for all fields where index-time boosting/length normalization isn't required; this will require a full reindex). 3. Are you sorting on

Re: Out of memory on sorting

2011-05-19 Thread rajini maski
Explicit Warming of Sort Fields: If you do a lot of field-based sorting, it is advantageous to add explicit warming queries to the newSearcher and firstSearcher event listeners in your solrconfig which sort on those fields, so the FieldCache is populated prior to any queries being executed by

RE: Out of memory on sorting

2011-05-19 Thread Rohit
this generic? -Rohit -Original Message- From: rajini maski [mailto:rajinima...@gmail.com] Sent: 19 May 2011 14:53 To: solr-user@lucene.apache.org Subject: Re: Out of memory on sorting Explicit Warming of Sort Fields If you do a lot of field based sorting, it is advantageous to add

Re: Out of memory on sorting

2011-05-19 Thread Erick Erickson
and the possibilities of queries unlimited. How can I make this generic? -Rohit -Original Message- From: rajini maski [mailto:rajinima...@gmail.com] Sent: 19 May 2011 14:53 To: solr-user@lucene.apache.org Subject: Re: Out of memory on sorting Explicit Warming of Sort Fields If you do a lot

RE: Out of memory on sorting

2011-05-19 Thread Rohit
, this is proving to be a big set back. Help would be greatly appreciated. Regards, Rohit -Original Message- From: Erick Erickson [mailto:erickerick...@gmail.com] Sent: 19 May 2011 18:21 To: solr-user@lucene.apache.org Subject: Re: Out of memory on sorting The warming queries warm up

Re: Out of memory on sorting

2011-05-19 Thread Erick Erickson
would be greatly appreciated. Regards, Rohit -Original Message- From: Erick Erickson [mailto:erickerick...@gmail.com] Sent: 19 May 2011 18:21 To: solr-user@lucene.apache.org Subject: Re: Out of memory on sorting The warming queries warm up the caches used in sorting. So just

Re: Out of memory while creating indexes

2011-03-04 Thread Praveen Parameswaran
Hi, post.sh is using curl as far as I can see; will that be helpful? On Fri, Mar 4, 2011 at 1:24 PM, Upayavira u...@odoko.co.uk wrote: post.jar is intended for demo purposes, not production use, so it doesn't surprise me you've managed to break it. Have you tried using curl to do the post?

Re: Out of memory while creating indexes

2011-03-03 Thread Gora Mohanty
On Fri, Mar 4, 2011 at 3:32 AM, Solr User solr...@gmail.com wrote: Hi All, I am trying to create indexes out of a 400MB XML file using the following command and I am running into out of memory exception. Is this a single record in the XML file? If it is more than one, breaking it up into

Re: Out of memory while creating indexes

2011-03-03 Thread Upayavira
post.jar is intended for demo purposes, not production use, so it doesn't surprise me you've managed to break it. Have you tried using curl to do the post? Upayavira On Thu, 03 Mar 2011 17:02 -0500, Solr User solr...@gmail.com wrote: Hi All, I am trying to create indexes out of a 400MB XML

Re: Out of memory error

2010-12-07 Thread Erick Erickson
Have you seen this page? http://wiki.apache.org/solr/DataImportHandlerFaq See especially batchSize, but it looks like you're already on to that. Do you have any idea how big the records are in the database? You might try adjusting the rambuffersize

Re: Out of memory error

2010-12-07 Thread Fuad Efendi
Related: SOLR-846 Sent on the TELUS Mobility network with BlackBerry -Original Message- From: Erick Erickson erickerick...@gmail.com Date: Tue, 7 Dec 2010 08:11:41 To: solr-user@lucene.apache.org Reply-To: solr-user@lucene.apache.org Subject: Re: Out of memory error Have you seen

Re: Out of memory error

2010-12-06 Thread Fuad Efendi
Batch size -1??? Strange but could be a problem. Note also you can't provide parameters to default startup.sh command; you should modify setenv.sh instead --Original Message-- From: sivaprasad To: solr-user@lucene.apache.org ReplyTo: solr-user@lucene.apache.org Subject: Out of memory

RE: Out of Memory

2010-03-23 Thread Craig Christman
Is this on Oracle 10.2.0.4? Looking at the Oracle support site there's a memory leak using some of the XML functions that can be fixed by upgrading to 10.2.0.5, 11.2, or by using 10.2.0.4 Patch 2 in Windows 32-bit. -Original Message- From: Neil Chaudhuri

RE: Out of Memory

2010-03-23 Thread Dennis Gearon
: Craig Christman cchrist...@caci.com Subject: RE: Out of Memory To: solr-user@lucene.apache.org solr-user@lucene.apache.org Date: Tuesday, March 23, 2010, 1:01 PM Is this on Oracle 10.2.0.4?  Looking at the Oracle support site there's a memory leak using some of the XML functions that can be fixed

Re: Out of Memory Errors

2008-10-22 Thread Nick Jenkin
Have you confirmed Java's -Xmx setting? (Max memory) e.g. java -Xmx2000m -jar start.jar -Nick On Wed, Oct 22, 2008 at 3:24 PM, Mark Miller [EMAIL PROTECTED] wrote: How much RAM in the box total? How many sort fields and what types? Sorts on each core? Willie Wong wrote: Hello, I've been

RE: Out of Memory Errors

2008-10-22 Thread r.prieto
Hi Willie, Are you using highlighting? If the response is yes, you need to know that for each document retrieved, the Solr highlighting loads into memory the full field that is used for this functionality. If the field is too long, you have problems with memory. You can solve the problem using
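The original message is truncated above, so the following is only a hedged illustration and may not be the fix the poster goes on to describe: one real knob for bounding highlighter work is hl.maxAnalyzedChars, which limits how much of each field value is analyzed for snippets. The field name, URL, and SolrJ client style are assumptions.

```java
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

public class BoundedHighlighting {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            SolrQuery q = new SolrQuery("body:memory");    // hypothetical query on a large stored field
            q.setHighlight(true);
            q.addHighlightField("body");
            q.set("hl.maxAnalyzedChars", 51200);           // analyze only the first ~50K chars per field value
            System.out.println(client.query(q).getHighlighting());
        }
    }
}
```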

Re: Out of Memory Errors

2008-10-22 Thread Jae Joo
Here is what I am doing to check the memory status. 1. Run the servlet and Solr application. 2. On a command prompt, run jstat -gc pid 5s (5s means it samples every 5 seconds). 3. Watch it or pipe it to a file. 4. Analyze the data gathered. Jae On Tue, Oct 21, 2008 at 9:48 PM, Willie Wong

Re: Out of Memory Errors

2008-10-21 Thread Mark Miller
How much RAM in the box total? How many sort fields and what types? Sorts on each core? Willie Wong wrote: Hello, I've been having issues with out of memory errors on searches in Solr. I was wondering if I'm hitting a limit with solr or if I've configured something seriously wrong. Solr

RE: Out of memory on Solr sorting

2008-08-05 Thread sundar shankar
From: [EMAIL PROTECTED] To: solr-user@lucene.apache.org Subject: RE: Out of memory on Solr sorting Date: Tue, 29 Jul 2008 10:43:05 -0700 A sneaky source of OutOfMemory errors is the permanent generation. If you add this: -XX:PermSize=64m -XX:MaxPermSize=96m You will increase the size

RE: Out of memory on Solr sorting

2008-08-05 Thread Fuad Efendi
Hi Sundar, If increasing LRU cache helps you: - you are probably using 'tokenized' field for sorting (could you confirm please?)... ...you should use 'non-tokenized single-valued non-boolean' for better performance of sorting... Fuad Efendi == http://www.tokenizer.org

RE: Out of memory on Solr sorting

2008-08-05 Thread sundar shankar
The field is of type text_ws. Is this not recommended? Should I use text instead? Date: Tue, 5 Aug 2008 10:58:35 -0700 From: [EMAIL PROTECTED] To: [EMAIL PROTECTED] Subject: RE: Out of memory on Solr sorting Hi Sundar, If increasing LRU cache helps you: - you are probably using

RE: Out of memory on Solr sorting

2008-08-05 Thread Fuad Efendi
My understanding of Lucene sorting is that it will sort by 'tokens' and not by 'full fields'... so for sorting you need a 'full-string' (non-tokenized) field, and for searching you need another, tokenized one. For instance, use 'string' for sorting and 'text_ws' for search; and use
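A hedged SolrJ sketch of that advice (not code from the thread): keep the tokenized search field and copy it into a separate non-tokenized string field used only for sorting. The field names "title" / "title_sort", the URL, and the availability of the SolrJ Schema API are assumptions.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.schema.SchemaRequest;

public class SortFieldSetup {
    public static void main(String[] args) throws Exception {
        try (SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build()) {
            Map<String, Object> sortField = new LinkedHashMap<>();
            sortField.put("name", "title_sort");
            sortField.put("type", "string");    // one untokenized value per doc, cheap to sort on
            sortField.put("indexed", true);
            sortField.put("stored", false);
            new SchemaRequest.AddField(sortField).process(client);
            // Populate the sort field automatically from the searchable (tokenized) field.
            new SchemaRequest.AddCopyField("title", Arrays.asList("title_sort")).process(client);
        }
    }
}
```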

RE: Out of memory on Solr sorting

2008-08-05 Thread Fuad Efendi
Best choice for sorting field: <!-- This is an example of using the KeywordTokenizer along with various TokenFilterFactories to produce a sortable field that does not include some properties of the source text --> <fieldType name="alphaOnlySort" class="solr.TextField"

Re: Out of memory on Solr sorting

2008-08-05 Thread Yonik Seeley
On Tue, Aug 5, 2008 at 1:59 PM, Fuad Efendi [EMAIL PROTECTED] wrote: If increasing LRU cache helps you: - you are probably using 'tokenized' field for sorting (could you confirm please?)... Sorting does not utilize any Solr caches. -Yonik

Re: Out of memory on Solr sorting

2008-08-05 Thread Fuad Efendi
I know, and this is strange... I was guessing the filterCache is used implicitly to get the DocSet for a token; as Sundar wrote, increasing the LRUCache helped him (he is sorting on a 'text_ws' field) -Fuad If increasing LRU cache helps you: - you are probably using 'tokenized' field for sorting (could you

RE: Out of memory on Solr sorting

2008-08-05 Thread sundar shankar
just fine. after that. I am currently reindexing, replacing the text_ws with string and setting the default size of all 3 caches to 512, and seeing if the problem goes away. -Sundar Date: Tue, 5 Aug 2008 14:05:05 -0700 From: [EMAIL PROTECTED] To: solr-user@lucene.apache.org Subject: Re: Out

RE: Out of memory on Solr sorting

2008-08-05 Thread Fuad Efendi
Sundar, it is very strange that increasing the size/initialSize of the LRUCache helps with OutOfMemoryError... 2048 is the number of entries in the cache and _not_ 2GB of memory... Making size==initialSize of the HashMap-based LRUCache would help with performance anyway; maybe with OOMs (probably no need to resize

RE: Out of memory on Solr sorting

2008-08-05 Thread sundar shankar
Oh wow, I didn't know that was the case. I am completely left baffled now. Back to square one I guess. :) Date: Tue, 5 Aug 2008 14:31:28 -0700 From: [EMAIL PROTECTED] To: solr-user@lucene.apache.org Subject: RE: Out of memory on Solr sorting Sundar, very strange that increase of size

RE: Out of memory on Solr sorting

2008-07-29 Thread Lance Norskog
that is not reclaimed, and so each undeploy/redeploy cycle eats up the permanent generation pool. -Original Message- From: david w [mailto:[EMAIL PROTECTED] Sent: Tuesday, July 29, 2008 7:20 AM To: solr-user@lucene.apache.org Subject: Re: Out of memory on Solr sorting Hi, Daniel I got

RE: Out of memory on Solr sorting

2008-07-23 Thread Daniel Alheiros
@lucene.apache.org Subject: RE: Out of memory on Solr sorting Yes, it is a cache; it stores an array of document IDs sorted by the sort field, together with the sorted field values; query results can intersect with it and reorder accordingly. But the memory requirements should be well documented. Internally it uses a WeakHashMap

RE: Out of memory on Solr sorting

2008-07-23 Thread sundar shankar
-Xmx2048m -XX:MinHeapFreeRatio=50 -XX:NewSize=1024m -XX:NewRatio=2 -Dsun.rmi.dgc.client.gcInterval=360 -Dsun.rmi.dgc.server.gcInterval=360 Jboss 4.05 Subject: RE: Out of memory on Solr sorting Date: Wed, 23 Jul 2008 10:49:06 +0100 From: [EMAIL PROTECTED] To: solr-user

RE: Out of memory on Solr sorting

2008-07-22 Thread sundar shankar
From: [EMAIL PROTECTED] To: solr-user@lucene.apache.org Subject: Out of memory on Solr sorting Date: Tue, 22 Jul 2008 19:11:02 + Hi, Sorry again, fellows. I am not sure what's happening. The day with Solr is bad for me I guess. EZMLM didn't let me send any mails this morning. Asked
