Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-27 Thread James E. Blair
Stefano Maffulli  writes:

> In any case, since Sean said that nova (and other projects) already
> remove unmergeable changesets regularly, I think the data are already
> "clean enough" to give us food for thought.

I am asking you to please independently remove changes that you don't
think should be considered from your metrics.  If you rely on Sean or
others to abandon changes, then you are, in essence, relying on core
reviewers abandoning changes for the purposes of providing "clean" (as
you put it) input to a metrics system.

I think abandoning changes so that the metrics look the way we want is a
terrible experience for contributors.

Especially as it appears some projects, such as nova, are in a position
where they are actually leaving -2 votes on changes which will not be
lifted for 2 or 3 months.  That means that if someone runs a script like
Sean's, these changes will be abandoned, yet there is nothing that the
submitter can do to progress the change in the meantime.  Abandoning
such a review is making an already bad experience for contributors even
worse.

-Jim

__
OpenStack Development Mailing List (not for usage questions)
Unsubscribe: openstack-dev-requ...@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev


Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-27 Thread Stefano Maffulli
On Thu, 2015-02-26 at 16:44 -0800, James E. Blair wrote:
> It is good to recognize the impact of this; however, I would suggest
> that if having open changes that are not "actively being worked" is a
> problem for statistics,

I don't think it's a problem for the statistics per se. The reports are
only a tool to analyze complex phenomena and translate them into
manageable items. In fact, we keep adding more stats as we go, because
every chart and table leaves us with more questions.

>  let's change the statistics calculation.  Please do not abandon the
> work of contributors to improve the appearance of
> these metrics.  Instead, simply decide what criteria you think should
> apply and exclude those changes from your calculations.

I'm currently thinking that it would be informative to plot the
distribution of the efficiency metrics, rather than simply coming up with a
filter to ignore long-standing changes with slow/null activity over some
arbitrary amount of time. I think it would be more interesting to see
how many 'inactive' vs 'active' changes there are at a given time.

In any case, since Sean said that nova (and other projects) already
remove unmergeable changesets regularly, I think the data are already
"clean enough" to give us food for thought.

Why do owners seem to be getting slower and slower at providing new
patches, even though the number of patches per changeset is fairly
stable? I'll look into the data more carefully with Daniel Izquierdo, as
I think there are huge outliers skewing the data (the difference between
median and average is huge).
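The median/average gap is easy to see on toy numbers (illustrative values, not the real Nova data): a couple of long-stalled changes drag the mean far above the median.

```python
from statistics import mean, median

# Days-to-merge for ten hypothetical changes: most land quickly,
# but two long-stalled outliers inflate the mean.
days_to_merge = [3, 4, 5, 6, 7, 8, 9, 10, 120, 180]

print(median(days_to_merge))  # 7.5 -> the typical contributor experience
print(mean(days_to_merge))    # 35.2 -> skewed upward by the two outliers
```

A big median/mean gap like this is exactly the signature of a heavy-tailed distribution, which is why plotting the full distribution is more informative than either summary number.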

/stef




Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-27 Thread Kyle Mestery
On Fri, Feb 27, 2015 at 4:02 AM, Daniel P. Berrange wrote:

> On Fri, Feb 27, 2015 at 09:51:34AM +1100, Michael Still wrote:
> > On Fri, Feb 27, 2015 at 9:41 AM, Stefano Maffulli wrote:
> >
> > > Does it make sense to purge old stuff regularly so we have a better
> > > overview? Or maybe we should chart a distribution of age of proposed
> > > changesets, too in order to get a better understanding of where the
> > > outliers are?
> >
> > Given that abandoning a review isn't binding (a proposer can easily
> > unabandon it), I do think we should abandon more than we do now. The
> > problem at the moment is that it's a manual process which isn't much
> > fun for the person doing the work.
> >
> > Another factor to consider here is that abandoned patches against bugs
> > make the bug look like someone is working on a fix, which probably
> > isn't the case.
> >
> > Nova has been trying some very specific things to try and address
> > these issues, and I think we're improving. Those things are:
> >
> > * specs
> > * priority features
>
> This increased level of process in Nova has actually made the negative
> effects of the 6 month cycle noticeably worse on balance. If you aren't
> able to propose your feature in the right window of the dev cycle, your
> chances of getting stuff merged have gone down significantly, and the time
> before users are likely to see your feature has correspondingly gone up.
> Previously people could come along with simple features at the end of
> the cycle and we had the flexibility to be pragmatic and review and
> approve them. Now we lack that ability even if we have the
> spare review cycles to consider it. The processes adopted have merely
> made us more efficient at disappointing contributors earlier in the
> cycle. There have been no changes made that would solve the bigger problem
> of the fact that Nova is far too large vs the size of the core review
> team, so we have an ongoing major bottleneck in our development. That
> bottleneck, combined with the length of the 6 month cycle, is an ongoing
> disaster for our contributors.
>
This is part of the reason we have moved to splitting Neutron into smaller,
bite-sized repositories with sometimes overlapping core reviewer
teams. It's also why we're spinning the backend logic out of in-tree
drivers and plugins, to allow faster iteration for the maintainers. Early
evidence indicates this has been successful; we'll see how it looks once we
get into the Liberty development cycle.

For a bit more context, you can see the blog I wrote on this [1].

Thanks,
Kyle

[1]
http://www.siliconloons.com/posts/2015-02-26-scaling-openstack-neutron-development/

> Regards,
> Daniel


Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-27 Thread Sean Dague
On 02/26/2015 05:41 PM, Stefano Maffulli wrote:
> On Thu, 2015-02-26 at 15:58 -0600, Kevin L. Mitchell wrote:
>> One thing that comes to mind is that there are a lot of reviews that
>> appear to have been abandoned; I just cleared several from the
>> novaclient review queue (or commented on them to see if they were still
>> alive).  I also know of a few novaclient changes that are waiting for
>> corresponding nova changes before they can be merged.  Could these be
>> introducing a skew factor?
> 
> Maybe, depending on how many there are and how old they are.
> How much cruft is there? Maybe the fact that we don't autoabandon
> anymore is a relevant factor?
> 
> Looking at Nova time to merge (not the client, since clients are not
> analyzed individually), the median is over 10 days (the mean wait is
> 29). But if you look at the trends of time to wait for reviewers, they've
> been trending down for 3 quarters in a row (both average and median)
> while time to wait for submitter is trending up.
> 
> http://git.openstack.org/cgit/openstack-infra/activity-board/plain/reports/2014-q4/pdf/projects/nova.pdf
> 
> Does it make sense to purge old stuff regularly so we have a better
> overview? Or maybe we should chart a distribution of age of proposed
> changesets, too in order to get a better understanding of where the
> outliers are?

We already purge old stuff that's unmergeable (no activity in > 4 weeks
with either a core -2 or a Jenkins -1). The last purge was about 4 weeks
ago. So effectively abandoned code isn't in the system.

The merge conflict detector also means that all patches eventually
get a Jenkins -1 if they aren't maintained. So you should consider
everything in the system active, for some definition of active.
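The purge criteria above can be sketched as a small predicate (the record shape and label names here are illustrative, not the actual cleanup script):

```python
from datetime import datetime, timedelta

# Hypothetical change record: a last-activity timestamp plus current
# votes, e.g. {"Code-Review": [-2, 1], "Verified": [-1]}.
def is_purgeable(last_activity: datetime,
                 votes: dict,
                 now: datetime,
                 idle: timedelta = timedelta(weeks=4)) -> bool:
    """True if a change matches the stated criteria: no activity for
    more than `idle` AND carrying either a core -2 (Code-Review) or a
    Jenkins -1 (Verified)."""
    stale = (now - last_activity) > idle
    blocked = (-2 in votes.get("Code-Review", ())
               or -1 in votes.get("Verified", ()))
    return stale and blocked
```

Since abandoning isn't destructive (the owner can restore the change), the cautious way to use something like this is a dry-run listing of matches before acting on any of them.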

-Sean

-- 
Sean Dague
http://dague.net



Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-27 Thread Daniel P. Berrange
On Fri, Feb 27, 2015 at 09:51:34AM +1100, Michael Still wrote:
> On Fri, Feb 27, 2015 at 9:41 AM, Stefano Maffulli wrote:
> 
> > Does it make sense to purge old stuff regularly so we have a better
> > overview? Or maybe we should chart a distribution of age of proposed
> > changesets, too in order to get a better understanding of where the
> > outliers are?
> 
> Given that abandoning a review isn't binding (a proposer can easily
> unabandon it), I do think we should abandon more than we do now. The
> problem at the moment is that it's a manual process which isn't much
> fun for the person doing the work.
> 
> Another factor to consider here is that abandoned patches against bugs
> make the bug look like someone is working on a fix, which probably
> isn't the case.
> 
> Nova has been trying some very specific things to try and address
> these issues, and I think we're improving. Those things are:
> 
> * specs
> * priority features

This increased level of process in Nova has actually made the negative
effects of the 6 month cycle noticeably worse on balance. If you aren't
able to propose your feature in the right window of the dev cycle, your
chances of getting stuff merged have gone down significantly, and the time
before users are likely to see your feature has correspondingly gone up.
Previously people could come along with simple features at the end of
the cycle and we had the flexibility to be pragmatic and review and
approve them. Now we lack that ability even if we have the
spare review cycles to consider it. The processes adopted have merely
made us more efficient at disappointing contributors earlier in the
cycle. There have been no changes made that would solve the bigger problem
of the fact that Nova is far too large vs the size of the core review
team, so we have an ongoing major bottleneck in our development. That
bottleneck, combined with the length of the 6 month cycle, is an ongoing
disaster for our contributors.

Regards,
Daniel
-- 
|: http://berrange.com  -o-http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org  -o- http://virt-manager.org :|
|: http://autobuild.org   -o- http://search.cpan.org/~danberr/ :|
|: http://entangle-photo.org   -o-   http://live.gnome.org/gtk-vnc :|



Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread James E. Blair
Stefano Maffulli  writes:

> On Thu, 2015-02-26 at 15:58 -0600, Kevin L. Mitchell wrote:
>> One thing that comes to mind is that there are a lot of reviews that
>> appear to have been abandoned; I just cleared several from the
>> novaclient review queue (or commented on them to see if they were still
>> alive).  I also know of a few novaclient changes that are waiting for
>> corresponding nova changes before they can be merged.  Could these be
>> introducing a skew factor?
>
> Maybe, depending on how many there are and how old they are.
> How much cruft is there? Maybe the fact that we don't autoabandon
> anymore is a relevant factor?
>
> Looking at Nova time to merge (not the client, since clients are not
> analyzed individually), the median is over 10 days (the mean wait is
> 29). But if you look at the trends of time to wait for reviewers, they've
> been trending down for 3 quarters in a row (both average and median)
> while time to wait for submitter is trending up.
>
> http://git.openstack.org/cgit/openstack-infra/activity-board/plain/reports/2014-q4/pdf/projects/nova.pdf
>
> Does it make sense to purge old stuff regularly so we have a better
> overview? Or maybe we should chart a distribution of age of proposed
> changesets, too in order to get a better understanding of where the
> outliers are?

It is good to recognize the impact of this; however, I would suggest
that if having open changes that are not "actively being worked" is a
problem for statistics, let's change the statistics calculation.  Please
do not abandon the work of contributors to improve the appearance of
these metrics.  Instead, simply decide what criteria you think should
apply and exclude those changes from your calculations.

-Jim



Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Michael Still
On Fri, Feb 27, 2015 at 9:41 AM, Stefano Maffulli  wrote:

> Does it make sense to purge old stuff regularly so we have a better
> overview? Or maybe we should chart a distribution of age of proposed
> changesets, too in order to get a better understanding of where the
> outliers are?

Given that abandoning a review isn't binding (a proposer can easily
unabandon it), I do think we should abandon more than we do now. The
problem at the moment is that it's a manual process which isn't much
fun for the person doing the work.

Another factor to consider here is that abandoned patches against bugs
make the bug look like someone is working on a fix, which probably
isn't the case.

Nova has been trying some very specific things to try and address
these issues, and I think we're improving. Those things are:

* specs
* priority features
* trivial patch monkeying

We will continue to seek further ways to improve our throughput.

Michael

-- 
Rackspace Australia



Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Stefano Maffulli
On Thu, 2015-02-26 at 14:18 -0600, Anne Gentle wrote:
> Do the features listed in the Release Notes each have appropriate
> documentation? So far we just link to the specifications for nova, for
> example. [1] So to me, it could be that a focus on specification
> acceptance means less time/energy for the actual user-facing docs.
> 
Great question. I have no idea how we can correlate blueprints/specs
completed with the existence of accompanying docs. The tool we have now
scans git repositories and Launchpad (and will soon scan StoryBoard, too):
what data is in there that the tool could use to gauge documentation
coverage?

Happy to brainstorm this with you and Bitergia's Jesus and Daniel as
they may have ideas.

/stef





Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Stefano Maffulli
On Thu, 2015-02-26 at 15:58 -0600, Kevin L. Mitchell wrote:
> One thing that comes to mind is that there are a lot of reviews that
> appear to have been abandoned; I just cleared several from the
> novaclient review queue (or commented on them to see if they were still
> alive).  I also know of a few novaclient changes that are waiting for
> corresponding nova changes before they can be merged.  Could these be
> introducing a skew factor?

Maybe, depending on how many there are and how old they are.
How much cruft is there? Maybe the fact that we don't autoabandon
anymore is a relevant factor?

Looking at Nova time to merge (not the client, since clients are not
analyzed individually), the median is over 10 days (the mean wait is
29). But if you look at the trends of time to wait for reviewers, they've
been trending down for 3 quarters in a row (both average and median)
while time to wait for submitter is trending up.

http://git.openstack.org/cgit/openstack-infra/activity-board/plain/reports/2014-q4/pdf/projects/nova.pdf
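The reviewer-wait vs submitter-wait split mentioned above can be computed by partitioning a change's timeline. This is only a sketch with a made-up event shape, not Bitergia's actual method: the gap after a patchset upload counts against reviewers, and the gap after a review counts against the submitter.

```python
from datetime import datetime, timedelta

def wait_times(events):
    """Split a change's timeline into reviewer-wait and submitter-wait.

    `events` is a chronological list of (timestamp, kind) pairs, where
    kind is "patchset" (owner uploads a revision) or "review" (a
    reviewer comments/votes). The gap following a patchset counts
    against reviewers; the gap following a review counts against the
    submitter.
    """
    reviewer_wait = timedelta()
    submitter_wait = timedelta()
    for (t0, kind), (t1, _) in zip(events, events[1:]):
        if kind == "patchset":
            reviewer_wait += t1 - t0
        else:
            submitter_wait += t1 - t0
    return reviewer_wait, submitter_wait
```

On this model, "time to wait for submitter" trending up means the gaps after review comments are growing, which is a different problem than slow reviewers and calls for a different fix.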

Does it make sense to purge old stuff regularly so we have a better
overview? Or maybe we should chart a distribution of age of proposed
changesets, too in order to get a better understanding of where the
outliers are?

/stef




Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Kevin L. Mitchell
On Thu, 2015-02-26 at 11:45 -0800, Stefano Maffulli wrote:
> The interesting bit of those charts is that overall for OpenStack
> projects, it seems that the reviews (comments to patchsets) are arriving
> quite quickly but the new patchsets take a lot longer to be submitted.
> 
> Too much debating and commenting over each patch? Or are the
> authors/owners of the changeset slow to respond with new patches? I
> don't have an answer. I'd be happy to look at the data with other
> people.

One thing that comes to mind is that there are a lot of reviews that
appear to have been abandoned; I just cleared several from the
novaclient review queue (or commented on them to see if they were still
alive).  I also know of a few novaclient changes that are waiting for
corresponding nova changes before they can be merged.  Could these be
introducing a skew factor?
-- 
Kevin L. Mitchell 
Rackspace




Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Ed Leafe
On 02/26/2015 01:45 PM, Stefano Maffulli wrote:

> The interesting bit of those charts is that overall for OpenStack
> projects, it seems that the reviews (comments to patchsets) are arriving
> quite quickly but the new patchsets take a lot longer to be submitted.
> 
> Too much debating and commenting over each patch? Or are the
> authors/owners of the changeset slow to respond with new patches? I
> don't have an answer. I'd be happy to look at the data with other
> people.

I would think that the expansion of the spec requirements is a major
contributor to this trend. Much of the debate that used to happen with
patch sets now happens on the spec, which delays the first submission of
a patch, and shortens the time for review, since reviewers are already
familiar with the changes being proposed.

-- Ed Leafe



Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Anne Gentle
On Thu, Feb 26, 2015 at 1:45 PM, Stefano Maffulli wrote:

> On Wed, 2015-02-25 at 21:15 +, Ian Cordasco wrote:
> > I read it the same way as Doug. I don’t think Jeremy was trying to
> > imply your reviews would move through more quickly if you reviewed
> > other people’s work. Just that, as with most open source projects,
> > there’s always at least 2 distinct groups: people who push code more
> > often and people who review code more often.
>
> this conversation reminded me that the median time to merge new code has
> been increasing every quarter in the past year, but dropped for the
> first time during last quarter (table and chart on page 23 of Q4
> quarterly report [1]). The mean number of iterations (patchsets) per
> proposed change has also decreased for the first time in Q4 2014.
>
> The interesting bit of those charts is that overall for OpenStack
> projects, it seems that the reviews (comments to patchsets) are arriving
> quite quickly but the new patchsets take a lot longer to be submitted.
>
> Too much debating and commenting over each patch? Or are the
> authors/owners of the changeset slow to respond with new patches? I
> don't have an answer. I'd be happy to look at the data with other
> people.
>
> I think more analysis is needed before we can identify and remove the
> problem.
>
> /Stef
>
>  [1]
>
> http://git.openstack.org/cgit/openstack-infra/activity-board/plain/reports/2014-q4/pdf/2014-q4_OpenStack_report.pdf
> The analysis doesn't count the *-specs repositories, only code and docs.
>
>
Thanks for the analysis Stef. One additional analysis I would like to see
is this:

Do the features listed in the Release Notes each have appropriate
documentation? So far we just link to the specifications for nova, for
example. [1] So to me, it could be that a focus on specification acceptance
means less time/energy for the actual user-facing docs.

What could I do to analyze and correlate feature completeness to doc
completeness? A desire to release docs with code is great, but we don't seem
to be doing so in the way that I define docs as being done.

I've worked with Agile teams in the past, and you have to put your
definition of done for docs in with the code. Often times, it's "release
notes only" in the definition of done. So I think we're working at that
level of "done" for docs, instead of "end user docs complete" or "API docs
complete" or "config docs complete" or "administration documented" as the
marker of complete for release.

We've seen with the python clients that the docs are frequently
complained about. The release cadence doesn't really give docs a chance to
do anything but scrape the help text to automate a collection of
information about each command in the CLI Reference. [2] That's one
solution, but one that serves speed rather than quality. Users tell me
they'd prefer to have commands for use cases and scenarios in the docs,
which is closer to our End User Guide. [3] But even the End User Guide
could be improved by showing examples of what's returned for each command;
we've only got about 10% coverage there.
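That kind of help-text scraping can be sketched with argparse introspection. This walks a parser's subcommands via the private `_actions`/`_choices_actions` attributes (a common but unofficial pattern), and the stand-in parser below is not the real novaclient CLI:

```python
import argparse

def collect_command_help(parser):
    """Collect each subcommand's one-line help text from an argparse parser."""
    help_by_command = {}
    for action in parser._actions:  # private attribute; stable in practice
        if isinstance(action, argparse._SubParsersAction):
            # _choices_actions holds the (name, help) pairs shown in --help
            for choice in action._choices_actions:
                help_by_command[choice.dest] = choice.help or ""
    return help_by_command

# Stand-in parser mimicking a client CLI
parser = argparse.ArgumentParser(prog="client-demo")
sub = parser.add_subparsers(dest="command")
sub.add_parser("list", help="List servers.")
sub.add_parser("boot", help="Boot a new server.")
```

The dict this produces is exactly the "collection of information about each command" shape: quick to generate, but it only ever captures one line per command, which is why it can't substitute for use-case docs.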

We're doing everything we can to make it easier to contribute to docs by
migrating to RST,[4] so let's please consider an energy transfer from specs
acceptance to end-user docs, especially during feature freeze time.

I kept meaning to note the difficulties for docs on the other thread, but
Stef's email reminds me we need more analysis as well.
Thanks,
Anne

1.
https://wiki.openstack.org/wiki/ReleaseNotes/Juno#OpenStack_Compute_.28Nova.29
2. http://docs.openstack.org/cli-reference/content/
3. http://docs.openstack.org/user-guide/content/
4. http://justwriteclick.com/2015/02/23/state-of-the-migration-to-sphinxrst/


> PS The analysis of the individual projects are in their own pdf
>
> http://git.openstack.org/cgit/openstack-infra/activity-board/tree/reports/2014-q4/pdf/projects
>
>
>



-- 
Anne Gentle
annegen...@justwriteclick.com


Re: [openstack-dev] [stable][all] Revisiting the 6 month release cycle [metrics]

2015-02-26 Thread Stefano Maffulli
On Wed, 2015-02-25 at 21:15 +, Ian Cordasco wrote:
> I read it the same way as Doug. I don’t think Jeremy was trying to
> imply your reviews would move through more quickly if you reviewed
> other people’s work. Just that, as with most open source projects,
> there’s always at least 2 distinct groups: people who push code more
> often and people who review code more often. 

This conversation reminded me that the median time to merge new code has
been increasing every quarter in the past year, but dropped for the
first time during last quarter (table and chart on page 23 of Q4
quarterly report [1]). The mean number of iterations (patchsets) per
proposed change has also decreased for the first time in Q4 2014.

The interesting bit of those charts is that overall for OpenStack
projects, it seems that the reviews (comments to patchsets) are arriving
quite quickly but the new patchsets take a lot longer to be submitted.

Too much debating and commenting over each patch? Or are the
authors/owners of the changeset slow to respond with new patches? I
don't have an answer. I'd be happy to look at the data with other
people.

I think more analysis is needed before we can identify and remove the
problem.

/Stef

 [1]
http://git.openstack.org/cgit/openstack-infra/activity-board/plain/reports/2014-q4/pdf/2014-q4_OpenStack_report.pdf
The analysis doesn't count the *-specs repositories, only code and docs.

PS The analysis of the individual projects are in their own pdf
http://git.openstack.org/cgit/openstack-infra/activity-board/tree/reports/2014-q4/pdf/projects


