Stefano Maffulli <> writes:

> On Thu, 2015-02-26 at 15:58 -0600, Kevin L. Mitchell wrote:
>> One thing that comes to mind is that there are a lot of reviews that
>> appear to have been abandoned; I just cleared several from the
>> novaclient review queue (or commented on them to see if they were still
>> alive).  I also know of a few novaclient changes that are waiting for
>> corresponding nova changes before they can be merged.  Could these be
>> introducing a skew factor?
> Maybe, depending on how many there are and how old we're talking.
> How much cruft is there? Maybe the fact that we no longer auto-abandon
> is a relevant factor?
> Looking at Nova's time to merge (not the client, since clients are not
> analyzed individually), the median is over 10 days (the mean wait is
> 29). But if you look at the trends, time to wait for reviewers has
> been trending down for 3 quarters in a row (both average and median),
> while time to wait for the submitter is trending up.
> Does it make sense to purge old stuff regularly so we have a better
> overview? Or maybe we should also chart a distribution of the age of
> proposed changesets, to get a better understanding of where the
> outliers are?

It is good to recognize the impact of this; however, I would suggest
that if open changes that are not "actively being worked" are a
problem for the statistics, we change the statistics calculation.
Please do not abandon contributors' work just to improve the appearance
of these metrics.  Instead, simply decide what criteria you think should
apply and exclude those changes from your calculations.


OpenStack Development Mailing List (not for usage questions)
