--- Comment #32 from Quim Gil <> ---
(In reply to Nemo from comment #31)
> (In reply to Alvaro from comment #30)
> >  min_days_for_review = 0.042 # one hour
> This I don't like. Speedy reviews are still reviews

Agreed, speedy reviews happen when there are at least two developers working
together fast. It is a factor to be considered in a project. It is also a
factor that might increase the differences between WMF results and the rest,
since it is easier to get two WMF employees on the same team working under such
conditions than, say, an independent developer waiting for a review from
someone with +2.
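For what it's worth, here is a minimal sketch of what that setting would do if
applied. The names, the data, and the days-as-float convention are my
assumptions for illustration, not the actual implementation:

```python
# Sketch: applying a min_days_for_review cutoff, assuming review delays
# are measured in fractional days (all names/data here are hypothetical).
MIN_DAYS_FOR_REVIEW = 1 / 24  # one hour expressed in days, ~0.042

# Hypothetical (changeset_id, review_delay_in_days) pairs.
changesets = [("I1", 0.01), ("I2", 0.5), ("I3", 0.03), ("I4", 2.0)]

# With the cutoff enabled, sub-hour ("speedy") reviews are dropped;
# keeping them simply means skipping this filter.
filtered = [(cid, d) for cid, d in changesets if d >= MIN_DAYS_FOR_REVIEW]
print(filtered)  # [('I2', 0.5), ('I4', 2.0)]
```

The point of the sketch is that the cutoff silently discards exactly the
fast-turnaround reviews that co-located teams produce.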

> If the median is still too skewed, use the 75th percentile as
> the WMF is doing... I don't remember where.

I think the plain median is good until someone comes up with a better argument.
The graphs should be good enough to show trends over time, and differences
between affiliations and repositories.
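To make the median-vs-75th-percentile trade-off concrete, here is a small
sketch. The review delays are made-up sample data; only the statistical point
(a few slow reviews pull the 75th percentile well above the median) is what
matters:

```python
# Sketch: median vs 75th percentile of review wait times, in days.
# The data below is invented for illustration.
import statistics

# Hypothetical review delays; note the long tail at the end.
review_days = [0.05, 0.1, 0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 14.0, 30.0]

median = statistics.median(review_days)
# quantiles(n=4) returns the three quartile cut points; index 2 is the
# 75th percentile (default "exclusive" interpolation).
q75 = statistics.quantiles(review_days, n=4)[2]

print(f"median: {median:.2f} days")           # median: 1.25 days
print(f"75th percentile: {q75:.2f} days")     # 75th percentile: 5.75 days
```

The median stays representative of the typical review, while the 75th
percentile is the one that moves when a backlog of slow reviews builds up.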

> > We will also filter out "i18n" submissions.

One question, then: do we need to explicitly filter changesets uploaded by
bots, or will they be filtered automatically if/when they merge their own
changes? Or are the localization patches still merged manually?

But yes, in case of doubt let's remove bots from the equation. The purpose of
these community metrics is to analyze the performance of humans.
