[ https://issues.apache.org/jira/browse/SOLR-16525?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17629882#comment-17629882 ]

Gus Heck commented on SOLR-16525:
---------------------------------

Regular, standardized performance measurement is definitely something we need.

It's been on my list of things I'd like to work on for a very long time, but I 
keep finding myself too busy. The graph is interesting; can you provide more 
detail on the scope of this? Are you going to measure start/stop times? 
Indexing throughput? Query latency? If indexing is involved, what tool are you 
using? My prior thought along these lines was that it should start from an 
indexing of the same data used for Lucene's performance graphs. If we mirror 
their tests, that would give us an idea of where Solr adds significant 
overhead, as well as detect where we introduce slowdowns or make improvements. 
We certainly have additional cases to test, like streaming expressions and 
TRAs, but my thought was to cover parity with Lucene's tests first.
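
For concreteness, here is a minimal sketch of the kind of query-latency 
measurement I have in mind. The endpoint, collection name, and query below are 
placeholders, and it uses plain java.net.http rather than SolrJ, so treat it as 
an illustration of the approach rather than a proposed harness:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.Arrays;

    public class QueryLatencyProbe {
        public static void main(String[] args) throws Exception {
            // Placeholder endpoint/collection/query; a real suite would
            // parameterize these and run a fixed query mix.
            String url = "http://localhost:8983/solr/techproducts/select?q=*:*&rows=10";
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();

            int warmup = 100, measured = 1000;
            long[] samples = new long[measured];
            for (int i = 0; i < warmup + measured; i++) {
                long start = System.nanoTime();
                // Discard the body; we only time the round trip.
                client.send(request, HttpResponse.BodyHandlers.discarding());
                long elapsed = System.nanoTime() - start;
                if (i >= warmup) samples[i - warmup] = elapsed;
            }
            Arrays.sort(samples);
            System.out.printf("p50=%.2f ms  p95=%.2f ms%n",
                    samples[measured / 2] / 1e6,
                    samples[(int) (measured * 0.95)] / 1e6);
        }
    }

The same shape (warmup, fixed workload, percentile reporting) would apply to 
indexing throughput, which is where mirroring Lucene's benchmark data would 
pay off.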

However, all that might be far beyond, or different from, what you are 
attempting here?

> Periodic performance benchmarking
> ---------------------------------
>
>                 Key: SOLR-16525
>                 URL: https://issues.apache.org/jira/browse/SOLR-16525
>             Project: Solr
>          Issue Type: Task
>      Security Level: Public(Default Security Level. Issues are Public) 
>            Reporter: Ishan Chattopadhyaya
>            Priority: Major
>
> This issue is about periodic performance testing of Solr commits against 
> pre-written suites of performance tests.


