My recommendation in these scenarios is to prototype:
- Fine-tune your schema.xml to map each field to the smallest data type
that meets your requirements (see the sketch after this list).
- Assign an initial heap to the Java process (you can start as low as
you like, even a few GB).
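
As a rough sketch of the first point (the VCF-style field names below are
my own assumption, not something from your data), picking the smallest
type that fits and turning off stored/indexed where you don't need them
keeps the index small:

    <!-- Hypothetical VCF fields; adjust types to your actual data. -->
    <field name="chrom" type="string" indexed="true"  stored="true"  docValues="true"/>
    <!-- pint rather than plong: chromosome positions fit in 32 bits -->
    <field name="pos"   type="pint"   indexed="true"  stored="true"  docValues="true"/>
    <field name="ref"   type="string" indexed="true"  stored="false"/>
    <field name="alt"   type="string" indexed="true"  stored="false"/>
    <!-- qual is only retrieved, never searched, so skip the index -->
    <field name="qual"  type="pfloat" indexed="false" stored="true"/>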
I've had very good luck with as much memory and as large an SSD as you can
buy, setting the JVM -Xms and -Xmx to exactly 31 GB, and letting the Linux
server do its own caching with the rest. 31 GB is a very specific number:
it keeps the heap just under the 32 GB threshold at which the JVM loses
compressed object pointers (compressed oops), so you get the largest heap
that still uses 32-bit references.
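
For instance (a minimal sketch assuming the standard bin/solr start
script; adjust to your install):

    # Start Solr with -Xms and -Xmx both pinned to 31 GB.
    bin/solr start -m 31g

    # Confirm the JVM still enables compressed oops at that heap size:
    java -Xmx31g -XX:+PrintFlagsFinal -version | grep UseCompressedOops

The RAM you don't give the heap stays free for the Linux page cache,
which is what actually keeps the index files hot.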
> On Jun 13, 2021, at 5:28 PM, Syed Hasan wrote:
>
> Hi Guys,
> I'm brand new to Solr. I've been investigating the proper way to search
> huge VCF file(s) to support my bioinformatics analysis functionality.
> I came across Solr, and it sounds very interesting and promising. Before
> I dive into it, I would like to know how much memory and hard-disk space