Tri:
What volume of content (number of documents) and index size are you
expecting? And what about document complexity: how many fields, what are
you storing in the index, how complex are the queries, etc.?

We have used Solr with 10M documents and seen 1-3 second response times on
the front end. That is with minimal tuning, 4-5 facet fields, large blobs
of content stored in the index, JRuby on Rails, and complex queries, all
under low load (so the caches are probably not warmed much).

We also have an external search application almost fully powered by Solr
(except for the web crawl), and its response time is typically under 1
second with about 100K documents. Solr itself probably accounts for 100-200
ms of that.
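
If it helps, that kind of split is easy to verify from SolrJ, since Solr
reports its own query execution time (QTime) separately from the full
client-side round trip. A minimal sketch, again with a hypothetical URL and
query:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class TimingSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical URL; replace with your own instance.
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://localhost:8983/solr");

        QueryResponse response = server.query(new SolrQuery("ipod"));

        // QTime is the time Solr spent executing the query (ms); elapsed
        // time also includes the network hop and response parsing.
        System.out.println("Solr QTime: " + response.getQTime() + " ms");
        System.out.println("Round trip: " + response.getElapsedTime() + " ms");
    }
}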

My sense is that Solr is about as fast as it gets and scales very, very
well. On the user group I have seen references to people using Solr for
100M documents or more. It would be useful to hear your use case(s).
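
On the sharding question from the original mail: once a single index can no
longer hit your response-time target, Solr's distributed search lets one
node fan a query out over several shards via the shards parameter. A rough
sketch, with made-up host names:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.CommonsHttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class ShardedQuerySketch {
    public static void main(String[] args) throws Exception {
        // Any shard can coordinate the distributed request; URL invented.
        CommonsHttpSolrServer server =
                new CommonsHttpSolrServer("http://shard1:8983/solr");

        SolrQuery query = new SolrQuery("ipod");
        // Comma-separated list of shards to query (hosts are hypothetical).
        query.set("shards", "shard1:8983/solr,shard2:8983/solr");

        QueryResponse response = server.query(query);
        System.out.println("Hits across shards: "
                + response.getResults().getNumFound());
    }
}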


On Mon, Jan 3, 2011 at 10:44 AM, Jak Akdemir <jakde...@gmail.com> wrote:

> Hi,
> You can find benchmark results, but they are not organized directly as
> "index size vs. response time":
> http://wiki.apache.org/solr/SolrPerformanceData
>
> On Sat, Jan 1, 2011 at 4:06 AM, Tri Nguyen <tringuye...@yahoo.com> wrote:
>
> > Hi,
> >
> > I remember going through some page that had graphs of response times
> > based on index size for Solr.
> >
> > Anyone know of such pages?
> >
> > Internally, we have some requirements for response times and I'm trying
> > to figure out when to shard the index.
> >
> > Thanks,
> >
> > Tri
>
