[ https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16014142#comment-16014142 ]

Ishan Chattopadhyaya edited comment on SOLR-10317 at 5/17/17 2:56 PM:
----------------------------------------------------------------------

bq. The other part of test metadata is test parameters, such as index # of 
threads, etc. The test result is meaningful with these parameters.
+1. Also, for the indexing graph, other meaningful information includes the number 
of shards, number of replicas, type of SolrJ client used, and batch size.
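Something like the following is the kind of per-run metadata I have in mind. This 
is just a rough sketch; the field names are placeholders, not an existing schema in 
the suite:

{code:java}
import java.util.LinkedHashMap;
import java.util.Map;

public class BenchmarkRunMetadata {
    // Hypothetical sketch of per-run metadata recorded next to each indexing result;
    // the field names are placeholders, not an existing schema in the suite.
    public static Map<String, Object> example() {
        Map<String, Object> meta = new LinkedHashMap<>();
        meta.put("commit", "7830462d4b7da3acefff6353419e71cde62d5fee");
        meta.put("numShards", 2);                   // shards in the test collection
        meta.put("numReplicas", 1);                 // replicas per shard
        meta.put("solrjClient", "CloudSolrClient"); // or ConcurrentUpdateSolrClient, HttpSolrClient
        meta.put("indexingThreads", 8);             // client-side indexing threads
        meta.put("batchSize", 1000);                // documents per update request
        return meta;
    }
}
{code}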

bq. I have added an option to see the state of memory during a specific test. 
The memory profile looks good. Please use a human-readable scale for the Y-axis and 
specify the units. Also, how are you accessing the memory usage?
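Just for reference, one straightforward way to sample heap usage from within a JVM 
is the standard MemoryMXBean; here is a minimal sketch (not necessarily what you 
are doing; for an external Solr process, the /admin/metrics API or remote JMX would 
be the equivalent):

{code:java}
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

public class HeapSampler {
    // Minimal sketch: periodically sample heap usage of the current JVM;
    // only one possible approach, not necessarily what the suite currently does.
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        for (int i = 0; i < 10; i++) {
            long usedMb = memory.getHeapMemoryUsage().getUsed() / (1024 * 1024);
            System.out.println("heap used: " + usedMb + " MB"); // human-readable units for the Y-axis
            Thread.sleep(1000);
        }
    }
}
{code}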

bq. I was thinking that It would be nice to show the author information (Name, 
email, and date) for the commit (In case the viewer needs to reach out to 
him/her for some performance related questions). 
It is good to show as much information as possible without cluttering the interface. 
However, I would prefer just the commit IDs and links in this format: 
https://git-wip-us.apache.org/repos/asf?p=lucene-solr.git;h=7830462d4b7da3acefff6353419e71cde62d5fee.
 Keep in mind that if this suite is run once a day, then each data point would be 
the result of all the intermediate commits between that day's run and the previous 
day's run. Is it possible to include information about all the intermediate 
commits in the display?
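For example, here is a rough sketch of collecting the intermediate commits between 
the previous run's SHA and the current one by shelling out to git; the SHAs and the 
checkout path below are placeholders, not values the suite already uses:

{code:java}
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;

public class IntermediateCommits {
    // Rough sketch: list the commits between the previous run's SHA and the current one.
    public static void main(String[] args) throws IOException, InterruptedException {
        String prevSha = "OLD_SHA";   // placeholder: SHA recorded for the previous night's run
        String currSha = "NEW_SHA";   // placeholder: SHA being benchmarked tonight
        Process p = new ProcessBuilder(
                "git", "log", "--pretty=format:%h %an %ad %s", prevSha + ".." + currSha)
                .directory(new File("lucene-solr"))   // placeholder path to the checkout
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);  // hash, author, date, subject of each intermediate commit
            }
        }
        p.waitFor();
    }
}
{code}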


> Solr Nightly Benchmarks
> -----------------------
>
>                 Key: SOLR-10317
>                 URL: https://issues.apache.org/jira/browse/SOLR-10317
>             Project: Solr
>          Issue Type: Task
>            Reporter: Ishan Chattopadhyaya
>              Labels: gsoc2017, mentor
>         Attachments: changes-lucene-20160907.json, 
> changes-solr-20160907.json, managed-schema, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, 
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, 
> solrconfig.xml
>
>
> Solr needs nightly benchmark reporting. Similar Lucene benchmarks can be 
> found here: https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr 
> nodes, both in SolrCloud and standalone mode, and record timing information 
> of various operations like indexing, querying, faceting, grouping, 
> replication etc.
> # It should be possible to run them either as an independent suite or as a 
> Jenkins job, and we should be able to report timings as graphs (Jenkins has 
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it 
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md 
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> Some of the frameworks above have support for building, starting, 
> indexing/querying, and stopping Solr. However, the benchmarks they run are very 
> limited. Any of these can be a starting point, or a new framework can be used 
> instead. The motivation is to cover every piece of Solr functionality with a 
> corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure 
> [~shalinmangar] and [~markrmil...@gmail.com] would help here.


