I was also looking at ways to collect review metrics. This is necessary to
make sure that we do a good job with code reviews, and it helps with
process improvement. Without measurements we have no way to know whether
the time spent reviewing code is worth the effort.
So I second Hovanes in saying that there is a need for this kind of
feature.
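
For what it's worth, the three metrics Hovanes lists below reduce to
simple ratios once you have the LOC, defect count and review time.
Something like this (just a sketch in Python with made-up numbers, not
anything that exists in Review Board today):

    def inspection_rate(loc_reviewed, review_hours):
        # KLOC inspected per hour of review time
        return (loc_reviewed / 1000.0) / review_hours

    def defect_rate(defect_count, review_hours):
        # defects found per hour of review time
        return defect_count / review_hours

    def defect_density(defect_count, loc_reviewed):
        # defects found per KLOC
        return defect_count / (loc_reviewed / 1000.0)

    # Example: a 400-line change, 5 reviewer comments, 1.5 hours of review
    print(inspection_rate(400, 1.5))   # ~0.27 KLOC/hour
    print(defect_rate(5, 1.5))         # ~3.3 defects/hour
    print(defect_density(5, 400))      # 12.5 defects/KLOC

The only fuzzy input is the review time, which is exactly what
Christian's timer idea (or the timestamp subtraction Hovanes suggests)
would provide.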

I like your approach to measurement. It's non-intrusive for the
users, which is great.
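
Just to make sure I understand the timer idea, I picture the bookkeeping
as a little state machine like the one below. The real thing would live
in the diff viewer's JavaScript and hook the scroll/keyboard events;
this is only an illustration, with invented names:

    import time

    IDLE_THRESHOLD = 90  # seconds; "a minute or two" without input pauses the clock

    class ReviewTimer(object):
        """Accumulates active review time, pausing after a period of inactivity."""

        def __init__(self):
            self.active_seconds = 0.0
            self.last_activity = None  # time of the last scroll or keystroke
            self.dialog_open = False   # an open comment dialog keeps the clock running

        def on_activity(self, now=None):
            """Call on every scroll or key press in the diff/screenshot viewer."""
            now = now if now is not None else time.time()
            if self.last_activity is not None:
                gap = now - self.last_activity
                # Only credit the gap if the reviewer wasn't idle, or a dialog was up.
                if gap <= IDLE_THRESHOLD or self.dialog_open:
                    self.active_seconds += gap
            self.last_activity = now

        def on_dialog_opened(self, now=None):
            self.on_activity(now)
            self.dialog_open = True

        def on_dialog_closed(self, now=None):
            self.on_activity(now)
            self.dialog_open = False

The accumulated active_seconds could then be sent back with the review
and fed into the metrics above.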

Additionally, I was wondering whether we should add a way to classify the
type of each comment (defect, coding style, bad practice, ...). I'm not
sure it's possible without driving the developers and reviewers
crazy, though :-)
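
Maybe it could be as lightweight as a single extra dropdown on the
comment form, backed by a short fixed list, e.g. (categories invented
here, just to illustrate):

    # A handful of coarse categories is probably all reviewers would tolerate.
    COMMENT_TYPES = (
        ('defect',   'Defect'),
        ('style',    'Coding style'),
        ('practice', 'Bad practice'),
        ('note',     'Author annotation'),
    )

That would also let the defect counts exclude the submitter's own
"author annotations" automatically.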

Cheers,
-Herve


On Oct 22, 5:35 pm, "Christian Hammond" <[EMAIL PROTECTED]> wrote:
> This goes back to a discussion I had with one of my co-workers in the very
> early days of Review Board. He recommended actually keeping track, on the
> page, of the time spent. It would basically work like this:
>
> * On scrolling in the screenshot or diff viewer page, start a timer.
> * On idling (no cursor or keyboard input) for a minute or two, pause the
> timer.
> * Keep the timer running while a comment dialog is up.
>
> This would give you roughly the amount of time spent actually looking at
> code.
>
> We never added this because there really wasn't a compelling reason for it,
> and it adds some overhead that we didn't feel was necessary.
>
> If there's a big demand for it, and someone wants to write a patch, we could
> probably include it in the codebase. Another option would be to make this an
> extension down the road when we roll out extension support (after 1.0, which
> should be released in December).
>
> Does this sound like it would work for your needs?
>
> Christian
>
> --
> Christian Hammond - [EMAIL PROTECTED]
> VMware, Inc.
>
> On Wed, Oct 22, 2008 at 3:28 PM, H M <[EMAIL PROTECTED]> wrote:
> >  We would like to see if we can gather some metrics from Reviewboard,
> > including:
>
> >    - *Inspection Rate:* The number of KLOC inspected per hour.
> >    - *Defect Identification Rate:* The number of defects found in one hour
> >    of inspection time.
> >    - *Defect Density:* The amount of defects found per KLOC.
>
> > We need 3 numbers to determine the above metrics:
>
> >    - *LOC*: Can easily get this from our repository since we mandate
> >    association between Review and Commit in the commit log and we require
> >    all code to go through ReviewBoard.
> >     - *Defect Count*: We can assume that Comments in RB are actual defects
> >    discovered during review (unless they are started by the submitter
> >    him/herself in which case they are "author annotations".)
> >    - *Review Time*:  ?????
>
> > So the only thing we are unsure about is the review time.
>
> > We are thinking of the following two methods:
>
> >    - *Review_Duration* = *Time_of_Ship_It_Comment* - *Review_Submission_Time*
> >    - *Review_Duration* = *Time_of_Ship_It_Comment* - *Time_of_First_Comment*
>
> > But the numbers above aren't exact. In some cases a reviewer might start
> > looking at the code immediately but not comment, or they may be trying the
> > code on their machine. Other times, the review may stay untouched overnight
> > (if it was submitted too late in the day) and might be looked at only the
> > following morning.
>
> > Does anyone else have suggestions or has anyone done something like this
> > before?
>
> > Hovanes
