We just started using Solr. I am trying to load a single file with 20 million
records into Solr using the CSV uploader. I keep getting an Out of Memory
exception after loading 7 million records. Here is the config:

<autoCommit>
         <maxDocs>10000</maxDocs>
         <maxTime>60000</maxTime>
</autoCommit>
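
For context, that block sits inside the update handler section of
solrconfig.xml; a minimal sketch assuming the stock 3.x layout (only the
maxDocs/maxTime values above are mine):

  <updateHandler class="solr.DirectUpdateHandler2">
    <!-- hard commit every 10,000 added docs or every 60 seconds,
         whichever comes first -->
    <autoCommit>
      <maxDocs>10000</maxDocs>
      <maxTime>60000</maxTime>
    </autoCommit>
  </updateHandler>
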
I also encountered a LockObtainFailedException:
org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out:
NativeFSLock@D:\work\solr\.\data\index\write.lock
        at org.apache.lucene.store.Lock.obtain(Lock.java:84)
        at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1097)
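
For reference, the lock type is configured in the index section of
solrconfig.xml; a minimal sketch assuming the stock 3.x layout:

  <mainIndex>
    <!-- valid values: native | simple | single | none -->
    <lockType>native</lockType>
    <!-- set to true to clear a stale write.lock left behind by a
         crashed or OOM'd JVM on startup -->
    <unlockOnStartup>false</unlockOnStartup>
  </mainIndex>
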

So I changed the lockType to single, but now I am getting an Out of Memory
exception again. I also increased the JVM heap space to 2048M, but I am still
getting an Out of Memory exception.
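
One other memory-related setting in the stock solrconfig.xml is the indexing
RAM buffer; a sketch with the 3.x default value:

  <indexDefaults>
    <!-- RAM used to buffer added documents before they are flushed
         to the index on disk -->
    <ramBufferSizeMB>32</ramBufferSizeMB>
  </indexDefaults>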




--
View this message in context: 
http://lucene.472066.n3.nabble.com/SOlR-Out-of-Memory-exception-tp3074636p3074636.html
Sent from the Solr - User mailing list archive at Nabble.com.
