Nice question! I would like to know about solutions in this field too.

On Sun, Jun 29, 2008 at 1:04 PM, Jacob Singh <[EMAIL PROTECTED]> wrote:

> Hi folks,
>
> Does anyone have any bright ideas on how to benchmark solr?  Unless
> someone has something better, here is what I am thinking:
>
> 1. Have a config file where one can specify info like how many docs, how
> large, how many facets, and how many updates / searches per minute
>
> 2. Use one of the various client APIs to generate XML files for updates
> using some kind of lorem ipsum text as a base and store them in a dir.
>
> 3. Use siege to run the updates at whatever interval is specified in
> the config, sending an update every x seconds and removing it from the
> directory
>
> 4. Generate a list of search queries based upon the facets created, and
> build a urls.txt with all of these search urls
>
> 5. Run the searches through siege
>
> 6. Monitor the output using nagios to see where load kicks in.
>
> This is not that sophisticated, and feels like it won't really pinpoint
> bottlenecks, but would approximately tell us where a server will start to
> bail.
>
> Does anyone have any better ideas?
>
> Best,
> Jacob Singh
>
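
Not a full answer, but here are a couple of rough sketches of how steps 2
and 4 of that plan could be scripted. These are only illustrations in
Python; the field names (id, category, body), the facet values, the output
paths and the Solr URL are made-up placeholders, so adjust them to the real
schema and host.

Step 2 -- generating Solr <add> documents from lorem ipsum text and writing
them into a directory for the update run to consume:

#!/usr/bin/env python
# Generate NUM_DOCS Solr update documents filled with lorem ipsum text
# plus one hypothetical facet field, and write each one into OUT_DIR.
import os
import random
from xml.sax.saxutils import escape

LOREM = ("lorem ipsum dolor sit amet consectetur adipiscing elit sed do "
         "eiusmod tempor incididunt ut labore et dolore magna aliqua").split()
FACET_VALUES = ["news", "blog", "forum", "wiki"]  # hypothetical facet values
OUT_DIR = "updates"
NUM_DOCS = 1000        # "how many docs" from the config
WORDS_PER_DOC = 200    # "how large"

os.makedirs(OUT_DIR, exist_ok=True)
for i in range(NUM_DOCS):
    body = " ".join(random.choice(LOREM) for _ in range(WORDS_PER_DOC))
    doc = ('<add><doc>'
           '<field name="id">doc-%d</field>'
           '<field name="category">%s</field>'
           '<field name="body">%s</field>'
           '</doc></add>') % (i, random.choice(FACET_VALUES), escape(body))
    with open(os.path.join(OUT_DIR, "doc-%d.xml" % i), "w") as f:
        f.write(doc)

Step 4 -- building the urls.txt of facet queries for siege to replay:

# Write one search URL per line, mixing a random term with a random
# facet filter, so siege hits a spread of cached and uncached queries.
import random

SOLR = "http://localhost:8983/solr/select"   # placeholder host/core
TERMS = ["lorem", "ipsum", "dolor", "tempor", "labore"]
FACET_VALUES = ["news", "blog", "forum", "wiki"]

with open("urls.txt", "w") as f:
    for _ in range(500):   # "how many searches" from the config
        f.write("%s?q=%s&fq=category:%s&facet=true&facet.field=category\n"
                % (SOLR, random.choice(TERMS), random.choice(FACET_VALUES)))

For step 5, something along the lines of "siege -f urls.txt -c 20 -t 5M"
should then replay those URLs at a fixed concurrency for a fixed time while
you watch load and response times on the Solr box.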



-- 
Hugo Pessoa de Baraúna

"Se vc faz tudo igual a todo mundo, não pode esperar resultados diferentes."

http://hugobarauna.blogspot.com/
