[ https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16102215#comment-16102215 ]
Ishan Chattopadhyaya commented on SOLR-10317:
---------------------------------------------

Thanks Vivek. As per our offline discussion, I can see that you have collapsed the "serial" benchmarks into the "concurrent" benchmarks with a threads=1 line. It looks much less confusing now!

Can you please document the steps for running the benchmarks for the past n days (not the past n commits), with just one commit per day? Even if the steps are manual in nature, please document them. Going forward, it would be good to have parameters that let us do that (perhaps also allowing a time of day to be specified); a sketch of one possible approach follows at the end of this message.

Also, I suggest that the top bar menu be kept very simple: Standalone Metrics and Cloud Metrics. Both could link to the "merged" view (better to call it the "unified" view). Each of the graphs can be linked to its individual page, and each page could have a back button that takes the user back to the "unified" page.

Also, can you please list the major TODO items?

> Solr Nightly Benchmarks
> -----------------------
>
>                 Key: SOLR-10317
>                 URL: https://issues.apache.org/jira/browse/SOLR-10317
>             Project: Solr
>          Issue Type: Task
>            Reporter: Ishan Chattopadhyaya
>              Labels: gsoc2017, mentor
>         Attachments: changes-lucene-20160907.json, changes-solr-20160907.json, managed-schema, Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx, Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf, SOLR-10317.patch, SOLR-10317.patch, solrconfig.xml
>
> Solr needs nightly benchmark reporting. Similar Lucene benchmarks can be found here: https://home.apache.org/~mikemccand/lucenebench/
> Preferably, we need:
> # A suite of benchmarks that builds Solr from a commit point, starts Solr nodes, both in SolrCloud and standalone mode, and records timing information for various operations like indexing, querying, faceting, grouping, replication, etc.
> # It should be possible to run them either as an independent suite or as a Jenkins job, and we should be able to report timings as graphs (Jenkins has some charting plugins).
> # The code should eventually be integrated into the Solr codebase, so that it never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> Some of the frameworks above support building, starting, indexing/querying and stopping Solr. However, the benchmarks they run are very limited. Any of these can be a starting point, or a new framework can be used as well. The motivation is to be able to cover every piece of Solr functionality with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure [~shalinmangar] and [~markrmil...@gmail.com] would help here.
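
For the past-n-days mode mentioned above, here is a minimal sketch of how one commit per day could be selected with git rev-list; the repository path, branch name, and the pick_daily_commits helper are hypothetical, not part of the current patch:

{code:python}
#!/usr/bin/env python3
# Hypothetical sketch: select at most one commit per day for the past n days,
# using `git rev-list -1 --before=<date>` to take the last commit of each day.
import subprocess
from datetime import date, timedelta

def pick_daily_commits(repo_path, branch="master", days=7):
    """Return a list of (day, commit-sha) pairs, newest first."""
    picks = []
    today = date.today()
    for offset in range(days):
        day = today - timedelta(days=offset)
        # Last commit on `branch` made before the end of that day.
        sha = subprocess.run(
            ["git", "-C", repo_path, "rev-list", "-1",
             "--before=%s 23:59:59" % day.isoformat(), branch],
            capture_output=True, text=True, check=True).stdout.strip()
        # Skip duplicates: a day with no new commits repeats the previous pick.
        if sha and (not picks or picks[-1][1] != sha):
            picks.append((day.isoformat(), sha))
    return picks

if __name__ == "__main__":
    # Assumed local checkout path; adjust to the actual lucene-solr clone.
    for day, sha in pick_daily_commits("lucene-solr", days=7):
        print(day, sha)
{code}

Supporting a time of day would then just mean replacing the hard-coded "23:59:59" cutoff with a parameter.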
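
For point #1 of the issue description (building Solr from a commit point and starting a node), a rough sketch of the sequence, assuming a local lucene-solr clone and the ant-based build; the paths and port are assumptions:

{code:python}
# Rough sketch: check out a commit, build Solr, and start a standalone node.
# Assumes git and ant on the PATH and a local lucene-solr checkout.
import subprocess

def build_and_start(repo="lucene-solr", sha="HEAD", port=8983):
    # Check out the commit under test.
    subprocess.run(["git", "-C", repo, "checkout", sha], check=True)
    # Build the Solr server from the solr/ directory.
    subprocess.run(["ant", "server"], cwd=repo + "/solr", check=True)
    # Start a standalone node; pass "-c" as well for SolrCloud mode instead.
    subprocess.run([repo + "/solr/bin/solr", "start", "-p", str(port)],
                   check=True)
{code}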
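
And for recording timing information for an operation like indexing, a minimal sketch that times a single batch through Solr's JSON update endpoint; the host, core name ("bench"), and field names are assumptions:

{code:python}
# Minimal sketch of one timing measurement, assuming a standalone Solr
# on localhost:8983 with a core named "bench" (both hypothetical).
import json, time, urllib.request

SOLR_UPDATE = "http://localhost:8983/solr/bench/update?commit=true"

def time_indexing(docs):
    """Index `docs` in one batch and return the wall-clock seconds taken."""
    body = json.dumps(docs).encode("utf-8")
    req = urllib.request.Request(
        SOLR_UPDATE, data=body,
        headers={"Content-Type": "application/json"})
    start = time.time()
    with urllib.request.urlopen(req) as resp:
        resp.read()  # drain the response so the request fully completes
    return time.time() - start

docs = [{"id": str(i), "title_s": "doc %d" % i} for i in range(10000)]
print("indexed 10k docs in %.3f s" % time_indexing(docs))
{code}

The same pattern (issue the operation, record elapsed wall-clock time) would apply to querying, faceting, grouping, etc., with the per-commit numbers feeding the graphs.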