Thanks for the help.

Looks like I've managed to get some semblance of this working.
Indexing is much faster, but the RAM usage by the SolrJ client is quite high. Is it
normal to see around 6GB of RAM usage?
(My test indexes 250,000 records with the 50 child entities.)

In short, I'm looping against the DB 50 times (to mimic the 50 child
entities), adding the results to a Map, and then looping through that Map to
add the documents to Solr and commit. A rough sketch of the flow is below.
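Roughly, it looks something like this (the table, column, core, field names
and JDBC URL below are placeholders, not the real ones):

import java.sql.*;
import java.util.*;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class IndexFromDb {
    public static void main(String[] args) throws Exception {
        // record id -> all child-entity values collected for that record
        Map<String, List<String>> byRecord = new HashMap<>();

        // Loop against the DB once per child entity (50 times in the test).
        try (Connection conn = DriverManager.getConnection("jdbc:<your-db-url>")) {
            for (int entity = 1; entity <= 50; entity++) {
                try (PreparedStatement ps = conn.prepareStatement(
                        "SELECT record_id, value FROM child_values WHERE entity_id = ?")) {
                    ps.setInt(1, entity);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            byRecord.computeIfAbsent(rs.getString("record_id"),
                                                     k -> new ArrayList<>())
                                    .add(rs.getString("value"));
                        }
                    }
                }
            }
        }

        // Loop through the Map, turn each entry into a SolrInputDocument,
        // send in batches, and commit once at the end.
        try (SolrClient solr = new HttpSolrClient.Builder(
                "http://localhost:8983/solr/records").build()) {
            List<SolrInputDocument> batch = new ArrayList<>();
            for (Map.Entry<String, List<String>> e : byRecord.entrySet()) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", e.getKey());
                doc.addField("child_values", e.getValue()); // multi-valued field
                batch.add(doc);
                if (batch.size() == 1000) {
                    solr.add(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                solr.add(batch);
            }
            solr.commit();
        }
    }
}

The batched solr.add() calls are just there to avoid sending one huge request;
the Map itself still holds every record's values in memory at once, which I
assume is where most of that 6GB is going.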


Jörn Franke wrote
> Ideally you use scripts that can use the JVM/Java - that way you can always
> use the latest SolrJ client library as well as other relevant libraries
> (e.g. Tika for unstructured content).
> This does not have to be Java directly; it can also be based on Scala or
> JVM scripting languages such as Groovy.
> 
> There are also wrappers for Python etc., but those may not always leverage
> the latest version of the library.





