Not really. It's a matter of when your system starts to bog down, and
unfortunately there's no good way to give general guidance, especially on
a number like index size. 90% of the index size could be stored data
(*.fdt and *.fdx files) that has no bearing on search requirements.
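
One way to check how much of an index is stored data is to total the file sizes by extension. A minimal sketch (the index directory layout is the standard Lucene one; the path is yours to fill in):

```python
import os
from collections import defaultdict

def sizes_by_extension(index_dir):
    """Total file sizes in a Lucene index directory, grouped by extension."""
    totals = defaultdict(int)
    for name in os.listdir(index_dir):
        path = os.path.join(index_dir, name)
        if os.path.isfile(path):
            # .fdt/.fdx are stored fields; large totals there inflate
            # index size without adding to search-time memory pressure.
            ext = os.path.splitext(name)[1] or name
            totals[ext] += os.path.getsize(path)
    return dict(totals)
```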

My advice would be to set up a test system and keep adding documents to it
until it blows up. You can fire queries you mine from the Solr logs at it
to simulate load. You may have to synthesize documents to get a good sense
of this.
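
Mining queries out of the logs and replaying them could look roughly like this. This is only a sketch: the `params={...}` pattern matches the usual Solr request-log lines, but your log format and the Solr URL may differ, so adjust both:

```python
import re
import urllib.request

def extract_queries(log_lines):
    """Pull the params={...} portion out of Solr request-log lines."""
    pattern = re.compile(r"params=\{([^}]*)\}")
    queries = []
    for line in log_lines:
        m = pattern.search(line)
        if m:
            queries.append(m.group(1))
    return queries

def replay(queries, base_url="http://localhost:8983/solr/collection1/select"):
    """Fire each mined query string at the test instance to simulate load."""
    for q in queries:
        # Response body is ignored; we only care about generating load.
        urllib.request.urlopen(base_url + "?" + q)
```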

But a 1G index is actually quite small by many standards. Of course it
depends on your hardware...

Best
Erick

On Mon, Jul 16, 2012 at 6:05 AM, Alexander Aristov
<alexander.aris...@gmail.com> wrote:
> People,
>
> What would be your suggestion?
>
> I have a basic Solr installation. The index is becoming bigger and bigger
> and has hit the 1 GB level.
>
> When shall I consider adding shards and splitting the index over them?
> Are there general suggestions?
>
>
> Best Regards
> Alexander Aristov
