At the summit there were lots of discussions about reviews, and I made the mistake of sending a mail to Russell proposing a few new stats we could gather.
I say mistake, because he did so and then some... we now have extra info - consider:

http://russellbryant.net/openstack-stats/nova-reviewers-30.txt

There are two new things. In each row there is a new column, 'received' - this counts the number of incoming reviews to each reviewer. Its use should be obvious - but remember that folk who contribute the occasional patch probably don't have the context to be doing reviews... those who are contributing many patches and getting many incoming reviews, however... This gives us the philanthropists (or perhaps team supporters...):

| klmitch ** | 137 0 19 0 118 9 86.1% | 14 ( 11.9%) | 0 (∞) |

(the bit at the end is a unicode infinity... we'll need to work on that). And so on :)

Down the bottom of the page:

Total reviews: 2980 (99.3/day)
Total reviewers: 241
Total reviews by core team: 1261 (42.0/day)
Core team size: 17
New patch sets in the last 30 days: 1989 (66.3/day)

and for 90 days:

Total reviews: 10705 (118.9/day)
Total reviewers: 406
Total reviews by core team: 5289 (58.8/day)
Core team size: 17
New patch sets in the last 90 days: 7515 (83.5/day)

This is the really interesting bit. Remembering that every patch needs - at minimum - two +2's, the *minimum* viable core team review rate to keep up is patch sets per day * 2:

30 days: 132 core reviews/day
90 days: 167 core reviews/day

But we're getting:

30 days: 42/day, or 90/day short
90 days: 59/day, or 108/day short

One confounding factor here is that this counts (AIUI) pushed changes, not change ids - so we don't need two +2's for every push, we need two +2's for every change id. We should add that as a separate metric, I think, as the needed +2 count will be a bit lower.

Anyhow, to me this is circling in nicely on having excellent information (vs data) on the review status, and from there we can start to say 'right, to keep up, Nova needs N core reviewers consistently doing Y reviews per day'. If Y is something sensible like 3 or 4, we can work backwards.
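The arithmetic above, and the working-backwards step, can be sketched roughly like this - a minimal illustration only, the function names are mine, and the only inputs are the per-day rates quoted from the stats pages:

```python
# Rough sketch of the arithmetic in this mail, using the quoted figures.
# Function names are illustrative; the rates come from Russell's stats pages.

def needed_core_rate(patch_sets_per_day):
    """Every patch needs at least two +2s, so the minimum viable core
    review rate is twice the incoming patch rate."""
    return patch_sets_per_day * 2

def core_team_size(patch_sets_per_day, reviews_per_core_per_day):
    """Working backwards: roughly how many core reviewers are needed to
    keep up at a given per-reviewer daily review rate?"""
    return round(needed_core_rate(patch_sets_per_day) / reviews_per_core_per_day)

# 30-day figures: 66.3 new patch sets/day, core actually doing 42.0/day.
print(int(needed_core_rate(66.3)))         # 132 core reviews/day needed
print(int(needed_core_rate(66.3) - 42.0))  # 90/day short

# 90-day figures: 83.5 new patch sets/day, core actually doing 58.8/day.
print(int(needed_core_rate(83.5)))         # 167 core reviews/day needed
print(int(needed_core_rate(83.5) - 58.8))  # 108/day short

# Working backwards to a team size, e.g. at 3 reviews/core/day:
print(core_team_size(66.3, 3))             # 44 cores over 30 days
print(core_team_size(83.5, 3))             # 56 cores over 90 days
```

The same function reproduces the rest of the table that follows, give or take rounding.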
Using the current figures (which, since we don't have changeId as a separate count, are a little confounded) that would give us:

time period   reviews/core/day   core-team-size
30 days       3                  44
30 days       4                  33
30 days       8                  17
90 days       3                  56
90 days       4                  42
90 days       10                 17

Also note that these are calendar days, so no weekends or leave for -core!

What else... in the last 30 days core have done 42% of reviews; in the last 90 days, 49%. So that's getting better. I know Russell has had concerns about core cohesion in the past, but I don't think doing 8 detailed reviews every day, including weekends, is individually sustainable. IMO we badly need more core reviewers... and that means:

- 20 or so volunteers
- who step up and do - pick a number - say 3 - reviews a day, every work day, like clockwork
- and follow up on their reviewed patches to learn what other reviewers say, and why
- until the nova-core team & Russell are happy that they can contribute effectively as -core.

Why such a big number of volunteers? Because we need a big number of people to spread the load, because Nova has a high incoming patch rate.

Cheers,
Rob

--
Robert Collins <[email protected]>
Distinguished Technologist
HP Converged Cloud

_______________________________________________
OpenStack-dev mailing list
[email protected]
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
