[
https://issues.apache.org/jira/browse/SOLR-10032?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Mark Miller updated SOLR-10032:
-------------------------------
Attachment: Test-Report-Sample.pdf
I'm still building the first report, but a partial sample is attached. At the
moment my script outputs a TSV file, which I just paste into Google Sheets.
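For illustration only, a minimal sketch of what such a script's aggregation step might look like. The input format, names, and numbers here are hypothetical (the actual script and its data source are not shown in this issue); the sketch just shows turning per-run pass/fail records into the kind of per-test flakiness TSV described above.

```python
import csv
import io
from collections import defaultdict

# Hypothetical input: one (test_name, passed) record per test run, as might
# be scraped from Jenkins job results. Sample data for illustration only.
runs = [
    ("TestFoo", True), ("TestFoo", False), ("TestFoo", True),
    ("TestBar", True), ("TestBar", True),
]

def flakiness_tsv(runs):
    """Aggregate per-test run/fail counts into TSV text, one row per test."""
    counts = defaultdict(lambda: [0, 0])  # test name -> [total runs, fails]
    for name, passed in runs:
        counts[name][0] += 1
        if not passed:
            counts[name][1] += 1
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    writer.writerow(["test", "runs", "fails", "fail_rate"])
    for name in sorted(counts):
        total, fails = counts[name]
        writer.writerow([name, total, fails, f"{fails / total:.2f}"])
    return out.getvalue()

print(flakiness_tsv(runs))
```

The TSV output pastes directly into a spreadsheet, where the fail_rate column can be sorted to surface the flakiest tests.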
> Create report to assess Solr test quality at a commit point.
> ------------------------------------------------------------
>
> Key: SOLR-10032
> URL: https://issues.apache.org/jira/browse/SOLR-10032
> Project: Solr
> Issue Type: Task
> Security Level: Public (Default Security Level. Issues are Public)
> Components: Tests
> Reporter: Mark Miller
> Assignee: Mark Miller
> Attachments: Test-Report-Sample.pdf
>
>
> We have many Jenkins instances blasting tests: some official, some Policeman,
> and I and others have or have had our own. The email trail proves the power of
> the Jenkins cluster to find test failures.
> However, I still have a very hard time with some basic questions:
> What tests are flakey right now? Which test failures actually affect devs
> most? Did I break it? Was that test already flakey? Is that test still
> flakey? What are our worst tests right now? Is that test getting better or worse?
> We really need a way to see exactly which tests are the problem, not because
> of OS or environmental issues but because of more basic test-quality issues:
> which tests are flakey, and how flakey are they at any point in time?
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]