Also make sure you don't have any autocommit rules enabled in solrconfig.xml.
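For reference, the autocommit settings live inside the update handler section of solrconfig.xml and look roughly like this (a sketch based on the standard example config; the values shown are illustrative). Commenting out or removing the autoCommit element disables automatic commits during the bulk load:

```xml
<!-- In solrconfig.xml: comment out or remove the <autoCommit> block while
     bulk loading, so Solr doesn't trigger commits mid-import. -->
<updateHandler class="solr.DirectUpdateHandler2">
  <autoCommit>
    <maxDocs>10000</maxDocs>  <!-- commit after this many docs -->
    <maxTime>60000</maxTime>  <!-- or after this many milliseconds -->
  </autoCommit>
</updateHandler>
```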
How many documents are in the 400MB CSV file, and how long does it
take to index now?
-Yonik
http://www.lucidimagination.com
On Tue, Jul 7, 2009 at 10:03 AM, Anand Kumar
Prabhakar wrote:
>
> Hi Yonik,
>
> Currently our Schema has very few fields and we don't have any copy fields
> also.
Hi Yonik,
Currently our schema has very few fields and we don't have any copy fields
either. Please find below the schema.xml we are using:
On Tue, Jul 7, 2009 at 9:14 AM, Anand Kumar
Prabhakar wrote:
> I want to know if there is any method to do
> it much faster; we have overcome the OutOfMemoryException by increasing heap
> space.
Optimize your schema - eliminate all unnecessary copyFields and
default values. The current example schema includes a number of copyFields
and default values you probably don't need.
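As a sketch of what a stripped-down schema.xml might look like for a plain CSV load (the field names here are placeholders, not taken from the thread): only the columns actually present in the CSV, no copyField directives, and no default values.

```xml
<schema name="csv-load" version="1.1">
  <types>
    <!-- Non-analyzed string type: the cheapest to index -->
    <fieldType name="string" class="solr.StrField" sortMissingLast="true"/>
  </types>
  <fields>
    <!-- Declare only the CSV columns; indexed/stored only where needed -->
    <field name="id"   type="string" indexed="true" stored="true" required="true"/>
    <field name="name" type="string" indexed="true" stored="true"/>
  </fields>
  <uniqueKey>id</uniqueKey>
  <!-- Note: no <copyField> entries and no default="" attributes -->
</schema>
```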
Thank you for the reply, Yonik. I have already tried with smaller CSV files;
currently we are trying to load a CSV file of 400 MB, but this is taking too
much time (more than half an hour). I want to know if there is any method to do
it much faster; we have overcome the OutOfMemoryException by increasing heap
space.
On Tue, Jul 7, 2009 at 8:41 AM, Anand Kumar
Prabhakar wrote:
> Is there any way so that we can read the data from the
> CSV file and load it into the Solr database without using "/update/csv"
That *is* the right way to load a CSV file into Solr.
How many records are in the CSV file, and how much h
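For anyone following the thread, a CSV load through /update/csv typically looks like the following (a sketch; the host, path, and file name are placeholders, and stream.file requires remote streaming to be enabled in solrconfig.xml):

```
# Have Solr read the file from local disk itself, avoiding a 400 MB HTTP POST.
# commit=true issues a single commit at the end instead of many along the way.
curl "http://localhost:8983/solr/update/csv?commit=true&stream.file=/path/to/data.csv&stream.contentType=text/csv;charset=utf-8"
```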