[
https://issues.apache.org/jira/browse/SOLR-10317?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16038003#comment-16038003
]
Vivek Narang commented on SOLR-10317:
-------------------------------------
Hi [~ichattopadhyaya], I think there is some confusion, so let me give some
explanation.
- I am not building a new framework; I am extending the benchmarks framework
that you created [https://github.com/chatman/solr-upgrade-tests], as mentioned
in the proposal.
- The reason I am extending your framework is that it already has many
flexible, ready-to-use resources and is written in a single language. I am more
comfortable working in one language than in two languages together.
- For the remaining items, I am already in the process of pulling the required
resources from Shalin's work into the framework that you created.
- As for tagging/adding significant events: the current logic in Shalin's code
base for listing significant events is hard-coded
[https://github.com/shalinmangar/solr-perf-tools/blob/master/src/python/bench.py#L32-L87].
The closest I have come to making it dynamic and self-contained is showing the
relevant commit message with each metric point
[http://212.47.227.9/prod/NumericQueryBenchmarkStandalone.html] (please hover
over any point to see its commit message).
- I will try to add a feature that lets you view all the graphs together.
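The commit-annotation approach above (replacing a hard-coded event list with a
per-point commit message) could be modeled roughly as follows. This is a
minimal sketch: the data shapes and function names are assumptions for
illustration, not the actual benchmark code.

```python
from dataclasses import dataclass

@dataclass
class MetricPoint:
    commit: str        # commit hash the benchmark ran against
    value: float       # measured metric, e.g. query time in ms
    tooltip: str = ""  # commit message shown on hover in the chart

def annotate_points(points, commit_messages):
    """Attach each commit's message to its metric point, so a chart can show
    it as a hover tooltip instead of relying on a hard-coded event list."""
    annotated = []
    for p in points:
        msg = commit_messages.get(p.commit, "(no commit message found)")
        annotated.append(MetricPoint(p.commit, p.value, msg))
    return annotated

# Example: benchmark runs joined with git-log-style commit metadata.
messages = {"abc123": "LUCENE-1234: faster numeric queries"}
points = [MetricPoint("abc123", 41.7), MetricPoint("def456", 44.2)]
for p in annotate_points(points, messages):
    print(p.commit, p.value, "->", p.tooltip)
```

The join stays dynamic because the message map can be rebuilt from the git log
on every nightly run, so no event list ever has to be maintained by hand.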
Please access the latest codebase at
[https://github.com/viveknarang/lucene-solr/tree/SolrNightlyBenchmarks/dev-tools/SolrNightBenchmarks].
As agreed in the proposal, the code for the benchmark suite lives under
dev-tools, in the SolrNightlyBenchmarks branch.
Regards
> Solr Nightly Benchmarks
> -----------------------
>
> Key: SOLR-10317
> URL: https://issues.apache.org/jira/browse/SOLR-10317
> Project: Solr
> Issue Type: Task
> Reporter: Ishan Chattopadhyaya
> Labels: gsoc2017, mentor
> Attachments: changes-lucene-20160907.json,
> changes-solr-20160907.json, managed-schema,
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks.docx,
> Narang-Vivek-SOLR-10317-Solr-Nightly-Benchmarks-FINAL-PROPOSAL.pdf,
> solrconfig.xml
>
>
> Solr needs nightly benchmarks reporting. Similar Lucene benchmarks can be
> found here, https://home.apache.org/~mikemccand/lucenebench/.
> Preferably, we need:
> # A suite of benchmarks that build Solr from a commit point, start Solr
> nodes, both in SolrCloud and standalone mode, and record timing information
> of various operations like indexing, querying, faceting, grouping,
> replication etc.
> # It should be possible to run them either as an independent suite or as a
> Jenkins job, and we should be able to report timings as graphs (Jenkins has
> some charting plugins).
> # The code should eventually be integrated in the Solr codebase, so that it
> never goes out of date.
> There is some prior work / discussion:
> # https://github.com/shalinmangar/solr-perf-tools (Shalin)
> # https://github.com/chatman/solr-upgrade-tests/blob/master/BENCHMARKS.md
> (Ishan/Vivek)
> # SOLR-2646 & SOLR-9863 (Mark Miller)
> # https://home.apache.org/~mikemccand/lucenebench/ (Mike McCandless)
> # https://github.com/lucidworks/solr-scale-tk (Tim Potter)
> There is support for building, starting, indexing/querying, and stopping
> Solr in some of the frameworks above. However, the benchmarks run are very
> limited. Any of these could be a starting point, or a new framework could be
> used as well. The motivation is to be able to cover every functionality of
> Solr with a corresponding benchmark that is run every night.
> Proposing this as a GSoC 2017 project. I'm willing to mentor, and I'm sure
> [~shalinmangar] and [[email protected]] would help here.
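The per-commit timing recording that the quoted issue asks for could be
sketched, very roughly, like this. The function names, record shape, and CSV
layout are all assumptions for illustration; the real suite would build Solr
from the commit point and start nodes around each measurement.

```python
import time

def time_operation(commit, operation, fn):
    """Run one benchmark operation (e.g. indexing or querying a fixed document
    set) and return a record suitable for a nightly results file."""
    start = time.perf_counter()
    fn()  # the operation under test
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return {"commit": commit, "operation": operation, "ms": elapsed_ms}

def to_csv_line(record):
    # One line per (commit, operation) pair; a Jenkins charting plugin or a
    # static HTML page can then plot the ms column over successive commits.
    return "%s,%s,%.2f" % (record["commit"], record["operation"], record["ms"])

# Example with a stand-in workload instead of a real Solr operation:
rec = time_operation("abc123", "indexing", lambda: sum(range(100000)))
print(to_csv_line(rec))
```

Keeping the output as one flat line per measurement makes it easy to append
results from each nightly run and graph them later without reprocessing.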
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)