+1, visualizing the number of ignored tests in a graph seems useful. Even better with some slices (e.g. per runner, module, ...).
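
A minimal sketch of such a graph, assuming we already have per-module counts of ignored tests over time (the history data and module names below are made up for illustration):

    import matplotlib.pyplot as plt

    # Hypothetical history: month -> {module: number of @Ignored tests}.
    history = {
        "2020-03": {"runners-flink": 4, "sdks-java-core": 2},
        "2020-04": {"runners-flink": 5, "sdks-java-core": 2},
        "2020-05": {"runners-flink": 5, "sdks-java-core": 3},
    }
    months = sorted(history)
    modules = sorted({m for counts in history.values() for m in counts})
    for module in modules:
        # One line per module gives the "slice" view per module;
        # slicing per runner would work the same way.
        plt.plot(months, [history[m].get(module, 0) for m in months],
                 marker="o", label=module)
    plt.ylabel("ignored tests")
    plt.legend()
    plt.savefig("ignored_tests.png")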

On 5/12/20 8:02 PM, Ahmet Altay wrote:
+1 to generating a report instead of removing these tests. A report like this could help us with prioritization. It is easier to address these issues when we can quantify how big a problem they are.

I am curious: what can we do to incentivize reducing the number of flaky/ignored tests? A report itself might provide some incentive; it is rewarding to see the number of ignored tests go down over time.

On Mon, May 11, 2020 at 8:30 AM Luke Cwik <lc...@google.com> wrote:

    Deleting ignored tests loses the record of why the test case
    existed, so I would rather keep them around. I think it would be
    more valuable to generate a report that goes on the website/wiki
    showing the stability of the modules (num tests, num passed, num
    skipped, num failed, as running averages over the past N runs). We
    had discussed doing something like this for ValidatesRunner so we
    could automatically show which runner supports what.
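
    A minimal sketch of such a report generator, assuming JUnit-style
    XML result files laid out under one directory per module (the path
    and layout are hypothetical, not Beam's actual build output); a
    running average over the past N runs would just mean keeping the
    last N of these summaries around:

        import glob
        import os
        import xml.etree.ElementTree as ET
        from collections import defaultdict

        def module_stability(results_root):
            """Aggregate JUnit XML counts per module directory."""
            counts = defaultdict(lambda: {"tests": 0, "failed": 0, "skipped": 0})
            for path in glob.glob(os.path.join(results_root, "*", "TEST-*.xml")):
                module = os.path.basename(os.path.dirname(path))
                suite = ET.parse(path).getroot()
                counts[module]["tests"] += int(suite.get("tests", 0))
                counts[module]["failed"] += (int(suite.get("failures", 0))
                                             + int(suite.get("errors", 0)))
                counts[module]["skipped"] += int(suite.get("skipped", 0))
            return counts

        for module, c in sorted(module_stability("build/test-results").items()):
            passed = c["tests"] - c["failed"] - c["skipped"]
            print(f"{module}: {c['tests']} tests, {passed} passed, "
                  f"{c['skipped']} skipped, {c['failed']} failed")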

    On Mon, May 11, 2020 at 12:53 AM Jan Lukavský <je...@seznam.cz> wrote:

        I think we do have Jira issues for ignored tests, so there
        should be no problem there. The questionable point is that
        once a test gets @Ignored, people might consider the problem
        "less painful" and postpone the correct fix until ...
        forever. I'd just like to discuss whether people see this as
        an issue. If yes, should we do something about it? If no,
        maybe we can create a rule that a test marked @Ignored for a
        long time may be deleted, because it is apparently just dead
        code.

        On 5/6/20 6:30 PM, Kenneth Knowles wrote:
        Good point.

        The raw numbers are available in the test run output. See
        https://builds.apache.org/view/A-D/view/Beam/view/PostCommit/job/beam_PreCommit_Java_Cron/2718/testReport/
        for the "skipped" column. And you get the same on the console
        or in a Gradle Scan:
        https://scans.gradle.com/s/ml3jv5xctkrmg/tests?collapse-all
        This would be good to review periodically for obvious trouble
        spots.
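
        A small sketch of pulling those numbers programmatically,
        assuming the Jenkins JUnit plugin's JSON API with its usual
        passCount/failCount/skipCount fields (worth verifying against
        our Jenkins version before relying on it):

            import json
            import urllib.request

            # Jenkins exposes the test report as JSON; the tree
            # parameter limits the response to the fields we need.
            url = ("https://builds.apache.org/job/beam_PreCommit_Java_Cron/"
                   "2718/testReport/api/json?tree=passCount,failCount,skipCount")
            with urllib.request.urlopen(url) as resp:
                report = json.load(resp)
            print("passed:", report.get("passCount"),
                  "failed:", report.get("failCount"),
                  "skipped:", report.get("skipCount"))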

        But I think you mean something more detailed: a report with
        the columns Test Suite, Test Method, Jira, Date Ignored, and
        Most Recent Update.

        I think we can get most of this from Jira, if we just make
        sure that each ignored test has a Jira issue and that they are
        all labeled in a consistent way. That would be the quickest
        way to get a result, even though it is not perfectly automated
        or audited.
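
        For example, with a consistent label this becomes a single JQL
        query against the Jira REST search API; the label
        "ignored-test" below is hypothetical and stands in for
        whatever convention we agree on:

            import json
            import urllib.parse
            import urllib.request

            # Hypothetical label; replace with the agreed convention.
            jql = 'project = BEAM AND labels = "ignored-test" AND status != Closed'
            url = ("https://issues.apache.org/jira/rest/api/2/search?"
                   + urllib.parse.urlencode(
                       {"jql": jql, "fields": "summary,created,updated"}))
            with urllib.request.urlopen(url) as resp:
                result = json.load(resp)
            for issue in result["issues"]:
                f = issue["fields"]
                # Key, date ignored (approximated by creation date),
                # most recent update, and a summary naming the test.
                print(issue["key"], f["created"][:10], f["updated"][:10],
                      f["summary"])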

        Kenn

        On Tue, May 5, 2020 at 2:41 PM Jan Lukavský <je...@seznam.cz> wrote:

            Hi,

            it seems we are accumulating test cases (see the
            discussion in [1]) that are marked @Ignored (mostly due to
            flakiness), which is generally undesirable. The associated
            JIRAs seem to stay open for a long time, and this might
            cause us to lose code coverage. Does anyone have an idea
            how to better visualize these @Ignored tests? My first
            idea would be something similar to the "Beam dependency
            check report", but that does not seem to be the best
            example (that is a completely different issue :)).
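
            As a starting point for the raw data, a minimal sketch
            that scans a checkout for JUnit @Ignore annotations (run
            from the repository root; the scan is textual, so it also
            counts annotations in comments):

                import pathlib
                import re

                # JUnit's annotation is org.junit.Ignore; \b keeps
                # this from matching longer identifiers.
                pattern = re.compile(r"@Ignore\b")
                total = 0
                for path in pathlib.Path(".").rglob("*.java"):
                    hits = pattern.findall(path.read_text(errors="ignore"))
                    if hits:
                        total += len(hits)
                        print(f"{path}: {len(hits)}")
                print("total @Ignore annotations:", total)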

            Jan

            [1] https://github.com/apache/beam/pull/11614
