Recently, we've been looking each week at the stats from http://russellbryant.net/openstack-stats/tripleo-openreviews.html to gauge how our review pipeline is tracking.
The main stat we've been tracking is "since the last revision without -1 or -2". I've included some history at , but the summary is that our 3rd quartile has slipped from 13 days to 16 days over the last 4 weeks or so. Our 1st quartile has been fairly steady lately, at around 1 day (down from 4 a month ago), and the median is unchanged at around 7 days.

There was lots of discussion in our last meeting about what could be causing this. However, the thing we wanted to bring to the list for discussion is: are we tracking the right metric? Should we be looking at something else to tell us how well our pipeline is performing? The meeting logs have quite a few suggestions about ways we could tweak the existing metrics, but if we're measuring the wrong thing, that's not going to help. I think what we are looking for is a metric that tells us whether the majority of patches are getting feedback quickly. Maybe there's some other metric that would give us a good indication?

--------

Current "Stats since the last revision without -1 or -2":

  Average wait time: 10 days, 17 hours, 6 minutes
  1st quartile wait time: 1 days, 1 hours, 36 minutes
  Median wait time: 7 days, 5 hours, 33 minutes
  3rd quartile wait time: 16 days, 8 hours, 16 minutes

At last week's meeting we had:

  3rd quartile wait time: 15 days, 13 hours, 47 minutes

A week before that:

  3rd quartile wait time: 13 days, 9 hours, 11 minutes

The week before that was the mid-cycle, but the week before that:

  19:53:38 <lifeless> Stats since the last revision without -1 or -2 :
  19:53:38 <lifeless> Average wait time: 10 days, 17 hours, 49 minutes
  19:53:38 <lifeless> 1st quartile wait time: 4 days, 7 hours, 57 minutes
  19:53:38 <lifeless> Median wait time: 7 days, 10 hours, 52 minutes
  19:53:40 <lifeless> 3rd quartile wait time: 13 days, 13 hours, 25 minutes

Some of the things suggested as potential causes of the long 3rd quartile times:

* We have a small number of really old reviews that have only positive scores but aren't being landed
* Some reviews get a -1 but then sit for a long time waiting for the author to reply
* We have some really old reviews that suddenly get revived after a long period of being WIP or abandoned, which reviewstats seems to miscount
* Reviewstats counts weekends, and we don't (so a change that gets pushed at 5pm US Friday and reviewed at 9am Australian Monday would be seen by us as having no wait time, but by reviewstats as ~36 hours)
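To make that last weekend discrepancy concrete, here's a minimal Python sketch of a wait-time calculation that can exclude weekend hours. This is not reviewstats' actual code; `wait_days` and `weekend_seconds` are hypothetical helpers, and timezones are ignored for simplicity:

```python
from datetime import datetime, timedelta
from statistics import quantiles

def weekend_seconds(start, end):
    """Seconds between start and end that fall on a Saturday or Sunday."""
    total = 0
    cursor = start
    while cursor < end:
        # Walk forward one calendar day at a time so each chunk has one weekday.
        next_midnight = (cursor + timedelta(days=1)).replace(
            hour=0, minute=0, second=0, microsecond=0)
        chunk_end = min(next_midnight, end)
        if cursor.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            total += (chunk_end - cursor).total_seconds()
        cursor = chunk_end
    return total

def wait_days(pushed, reviewed, skip_weekends=False):
    """Wait time in days, optionally not counting weekend hours."""
    seconds = (reviewed - pushed).total_seconds()
    if skip_weekends:
        seconds -= weekend_seconds(pushed, reviewed)
    return seconds / 86400

# The example from the thread, collapsed to one timezone: pushed at
# 17:00 on a Friday, first reviewed 36 wall-clock hours later.
pushed = datetime(2014, 3, 7, 17, 0)   # Friday 17:00
reviewed = datetime(2014, 3, 9, 5, 0)  # Sunday 05:00, 36 hours later

print(wait_days(pushed, reviewed))                      # 1.5 days wall-clock
print(wait_days(pushed, reviewed, skip_weekends=True))  # ~0.29 days (7 weekday hours)

# The quartile stats reviewstats reports are then just quartiles over a
# batch of per-review waits, e.g.:
waits = [wait_days(p, r) for p, r in [(pushed, reviewed)] * 4]
q1, median, q3 = quantiles(waits, n=4)
```

With weekends counted, this review has waited 36 hours; with them excluded, only the 7 Friday-evening hours count, which is roughly the gap between our intuition and the published numbers.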
_______________________________________________
OpenStack-dev mailing list
OpenStackfirstname.lastname@example.org
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev