On Thu, Dec 22, 2011 at 7:02 AM, Zoran | Bax-shop.nl <
zoran.bi...@bax-shop.nl> wrote:

> Hello,
>
> What are (ballpark figure) the hardware requirements (disk space, memory)
> SOLR will use in this case:
>
>
> *         Heavy-traffic Dutch webshop, 30,000 - 50,000 visitors a day
>

The number of unique users doesn't much matter.


> *         Visitors relying heavily on the search engine of the site
> o   3,000,000 - 5,000,000 searches a day
>

This is what matters.

Assume 20,000 seconds per day (roughly 4x less than the actual 86,400, which
leaves headroom for peak rates).  At 5,000,000 searches a day, that gives
about 250 queries/second.
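
In round numbers (the same arithmetic as above, with the 4x headroom factor
as the only assumption):

searches_per_day = 5000000    # upper end of the stated range
effective_seconds = 20000     # ~4x less than 86,400 s, leaving headroom for peaks
print(searches_per_day / effective_seconds)   # -> 250.0 queries/second at peak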

Is this rate growing?


> *         Around 20,000 products to be indexed. As XML this is around
> 22 MB in size
> o   Around 100-200 products that will need reindexing every day because of
> copywriters
>

This is small enough to not much matter.


> *         About 20 fields to be indexed per document (product)
> *         Using many features of SOLR
> o   Boosting queries
> o   Faceted search (price ranges, categories, in stock, etc.)
> o   Spellchecker
> o   Suggester (completion)
> o   Phonetic search
>

Just make sure that you are serving search results from memory, not disk.
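
With a 20 MB index that should happen more or less automatically, but it is
worth a quick sanity check. A rough sketch (Python, Linux-only because it
reads /proc/meminfo; the index path is a placeholder for your real Solr data
directory) that compares index size to RAM:

import os

# Placeholder path -- point this at your actual Solr index directory.
INDEX_DIR = "/var/solr/data/products/index"

# Total size on disk of all index files.
index_bytes = sum(
    os.path.getsize(os.path.join(root, name))
    for root, _, files in os.walk(INDEX_DIR)
    for name in files
)

# Total physical RAM, read from /proc/meminfo (Linux only).
with open("/proc/meminfo") as fh:
    meminfo = dict(line.split(":", 1) for line in fh)
total_ram_kb = int(meminfo["MemTotal"].split()[0])

print("index: %.1f MB, RAM: %.0f MB"
      % (index_bytes / 2.0 ** 20, total_ram_kb / 1024.0))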


> The current index directory is around 20 MB, but that's my testing
> environment. On my testing server indexing the 20K documents took under 10
> seconds.
>

Nice.


> I tried to be as comprehensive as possible with these specs. Hopefully
> it's enough to make an estimate.
>

So the next step is to build a test rig and see how many queries per second
each server will handle.  Since your index is small this should be
straightforward, and the required rate of 250 queries/s should be easy to
achieve.  Nothing will substitute for a real test here.
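
The rig doesn't need to be fancy. A short script along these lines (the URL,
core name, and query terms are placeholders; replaying real logged queries
will give more realistic numbers) is enough to see roughly where one server
tops out:

import time
import urllib.parse
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Placeholders -- substitute your own Solr URL, core name, and query log.
SOLR_URL = "http://localhost:8983/solr/products/select"
QUERIES = ["gitaar", "microfoon", "kabel", "versterker"]
N_REQUESTS = 5000
CONCURRENCY = 16

def one_query(i):
    # Fire one search request; we only time it, we don't inspect the response.
    params = urllib.parse.urlencode({"q": QUERIES[i % len(QUERIES)], "rows": 10})
    with urllib.request.urlopen(SOLR_URL + "?" + params, timeout=5) as resp:
        resp.read()

start = time.time()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    list(pool.map(one_query, range(N_REQUESTS)))
elapsed = time.time() - start

print("%.0f queries/second with %d threads" % (N_REQUESTS / elapsed, CONCURRENCY))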

You should make sure you have staging / spare hardware and room to grow if
necessary.
