One recurring topic in our weekly meetings over the last few months has
been the fact that it's taking us longer and longer to review and land
patches. Right now, the stats are as follows:

   - Stats since the latest revision:
      1. Average wait time: 10 days, 14 hours, 37 minutes
      2. 1st quartile wait time: 4 days, 8 hours, 45 minutes
      3. Median wait time: 7 days, 10 hours, 50 minutes
      4. 3rd quartile wait time: 15 days, 9 hours, 57 minutes
      5. Number waiting more than 7 days: 63
   - Stats since the last revision without a -1 or -2:
      1. Average wait time: 11 days, 22 hours, 21 minutes
      2. 1st quartile wait time: 4 days, 10 hours, 55 minutes
      3. Median wait time: 8 days, 3 hours, 49 minutes
      4. 3rd quartile wait time: 18 days, 23 hours, 20 minutes
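For anyone curious how numbers like these might be derived, here's a minimal sketch in Python. The patch timestamps are made up for illustration, and the real stats tooling may compute quantiles differently; this just shows the general shape of the calculation.

```python
# Sketch: computing wait-time stats from per-patch "last revision" timestamps.
# All data here is hypothetical, purely for illustration.
from datetime import datetime, timedelta
from statistics import mean, quantiles

now = datetime(2014, 9, 1, 12, 0)

# Hypothetical last-revision times for six open patches.
last_revision_times = [
    now - timedelta(days=d, hours=h)
    for d, h in [(2, 3), (4, 9), (7, 11), (9, 0), (15, 10), (20, 5)]
]

# Wait time = how long each patch has sat since its last revision.
waits = [now - t for t in last_revision_times]
secs = [w.total_seconds() for w in waits]

avg = timedelta(seconds=mean(secs))
# quantiles(n=4) yields the 1st quartile, median, and 3rd quartile.
q1, med, q3 = (timedelta(seconds=s) for s in quantiles(secs, n=4))
stale = sum(1 for w in waits if w > timedelta(days=7))

def fmt(td):
    """Render a timedelta as 'D days, H hours, M minutes'."""
    days, rem = divmod(int(td.total_seconds()), 86400)
    hours, rem = divmod(rem, 3600)
    return f"{days} days, {hours} hours, {rem // 60} minutes"

print("Average wait time:", fmt(avg))
print("1st quartile wait time:", fmt(q1))
print("Median wait time:", fmt(med))
print("3rd quartile wait time:", fmt(q3))
print("Number waiting more than 7 days:", stale)
```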

There are many things that can contribute to this; for instance, a patch
that has no negative reviews but also can't be approved (because it isn't
passing CI, or because it depends on other changes that are more
contentious) will increase these average wait times. I kicked off a
discussion a few weeks ago about the possibility that we're measuring the
wrong thing - if you've got suggestions about what we should be measuring,
please follow up there.

There's one trend I'm seeing that seems to be exacerbating our
sluggishness: many of our core reviewers are not meeting their commitment
to 3 reviews per day. The review stats currently show just 9 cores (and 2
non-cores!) having exceeded 60 reviews over the last 30 days; 10 cores
have not[1]. This isn't just a short-term glitch, either: the 90-day stats
show the same numbers, albeit with a slightly different set of reviewers
on either side of the cutoff.

The commitment to 3 reviews per day is one of the most important things we
ask of our core reviewers. We want their reviews to be good quality to help
make sure our code is also good quality - but if they can't commit to 3
reviews per day, they are slowing the project down by making it harder for
even good quality code to land. There's little point in us having good
quality code if it's perpetually out of date.

I'd like to call on all existing cores to make a concerted effort to meet
the commitment they made when they were accepted as core reviewers. We need
to ensure that patches are being reviewed and landed, and we can't do that
unless cores take the time to do reviews. If you aren't able to meet the
commitment you made when you became a core, it would be helpful if you
could let us know - or undertake a meta-review so we can consider adding
some new cores (it seems we might have 2 strong candidates, if the quality
of their reviews matches the quantity).

I'd like to call on everyone else (myself included - I've only managed 13
reviews over the last 30 days!) to help out as well. It's easier for cores
to review patchsets that already have several +1s. If you're looking for a
place to start, there's a list of the oldest patches that are looking for
reviews, and a dashboard with sections for reviews that have gone a long
time without feedback.
If you have ideas for other things we can or should measure, please follow
up on the other thread, or in our weekly meeting.

[1] lxsli and jonpaul-sullivan have been cores since the conclusion of the
relevant thread, but are not yet shown as such; this will be corrected.
OpenStack-dev mailing list
