This looks good to me at first blush. Please make sure this gets
captured in the wiki.

As far as the metric registry goes, if you use a singleton component
then you guarantee that each plugin gets the same instance. Is that
what you needed?

/** @component */
private MetricRegistry registry;

This is going to need to be in a common dependency of all the plugins
using this.
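To make the sharing guarantee concrete, here's a minimal plain-Java
sketch. The Plexus container wiring is simulated with a static lookup,
and everything apart from the MetricRegistry method from Mauro's mail is
made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class SingletonDemo {

    interface MetricRegistry {
        void reportMetric(Object location, String reportName, String metric, Number value);
    }

    // Stand-in for the container's singleton instantiation strategy:
    // one instance is created and handed out to every requester.
    static final MetricRegistry INSTANCE = new MetricRegistry() {
        final Map<String, Number> metrics = new HashMap<>();

        public void reportMetric(Object location, String reportName, String metric, Number value) {
            metrics.put(location + "/" + reportName + ":" + metric, value);
        }
    };

    // Each "plugin" asks the container for the component...
    static MetricRegistry lookup() {
        return INSTANCE;
    }

    public static void main(String[] args) {
        MetricRegistry checkstyle = lookup();
        MetricRegistry pmd = lookup();
        // ...and both get the identical object, so metrics accumulate
        // in one shared place.
        System.out.println(checkstyle == pmd); // prints "true"
    }
}
```

The point is only the identity guarantee: with a singleton component,
every plugin that reports into the registry is writing to the same map.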

-  Brett



Mauro Botelho wrote:
> I'd like to propose a solution for this. Here's an attempt to describe
> what I was able to put together, but I'd like the opinion of the
> community before I get too deep :)
> 
> First of all, I'm following the approach that has been documented in the
> wiki so far. The main difference is that instead of persisting to a
> database right away, I decided to create a MetricRegistry interface.
> Implementers would then be free to decide what kind of persistence
> like.
> 
> This MetricRegistry interface has only two methods so far:
> 
> void reportMetric(Object location, String reportName, String metric,
>             Number value);
> 
> Number getMetric(Object location, String reportName, String metric)
>             throws MetricNotFoundException;
> 
> Location is an object that would reflect an "abstract project tree".
> For Java, for example, we could come up with a tree like:
> 
> project -> package -> file -> class -> line -> column
>                            -> line -> column
> 
> against which plug-ins would report metrics.
> 
> ReportName/metric is the metric itself. For example, the Checkstyle
> plug-in reports counts per severity, so we would have the metrics:
> 
> checkstyle:errors
> checkstyle:warnings
> checkstyle:info
> 
> and for PMD we could have
> 
> pmd:violation
> pmd:files
> 
> Value is the actual value of the metric. It is a Number rather than a
> primitive so that a null value can represent a "not applicable" kind
> of use case.
> 
> As for the parsers, the plug-ins themselves would provide them, as is
> the case right now with the Checkstyle plug-in, and they would be
> responsible for reporting the metrics to the registry. When/if an
> abstract project tree is created, plug-ins could be expected to report
> their metrics using locations determined by this APT. This would allow
> us to eventually create merged reports for statistics. Think of a
> report crossing Checkstyle violations, coverage and volatility for
> each file in a project.
> 
> One could ask why not let plug-ins report their analysis against the
> APT and then have something come along and traverse the tree to
> generate the report, separating the analysis from the reporting. That
> is a possibility; the problem is that it would couple all the plug-ins
> to an implementation of a unified APT that doesn't exist today.
> 
> This solution would not address the Admin and Analyzer boxes in
> Vincent's diagram. Those could be fulfilled by other tools, as
> suggested in the wiki. The registry persister could then save the
> information in a form that such tools would be able to report on.
> 
> Here are a few implementation questions I have.
> 
> I'd prefer that this metric registry be populated by the plug-ins as
> they are run for the site plug-in report. The dashboard would then be
> a report that runs last and is just an HTML, DB, or XML rendition of
> the metric registry. I'm not sure how this would work in a
> multi-module build.
> 
> I think a sensible approach would be to allow plug-ins to define a
> MetricRegistry property that would be configured in the POM and set by
> the Maven engine. Would that break the reporting API contract, for
> example? What is the best way to make sure that there's only one
> instance of the metric registry per "build"?
> 
> Could we reuse the site plug-in to populate/report the metric
> registry, or would it be better as a separate plug-in? I really would
> like to avoid running the same reports multiple times in the same
> build. The main reason is that we have a dysfunctional project whose
> unit tests (more of an integration test, really) take 15 mins to run;
> today it takes 1h to run a site: 15 mins each for the coverage report
> and the test report, doubled because both run for the site and for the
> dashboard-single reports.
> 
> 
> Mauro
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: [EMAIL PROTECTED]
> For additional commands, e-mail: [EMAIL PROTECTED]
> 
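For what it's worth, here's a minimal in-memory sketch of the proposed
interface, just to make the contract concrete. The method signatures
and MetricNotFoundException follow Mauro's mail; the flat string key
scheme and everything else are assumptions for illustration only:

```java
import java.util.HashMap;
import java.util.Map;

public class InMemoryMetricRegistry {

    public static class MetricNotFoundException extends Exception {
        public MetricNotFoundException(String message) {
            super(message);
        }
    }

    private final Map<String, Number> metrics = new HashMap<>();

    // Hypothetical key scheme: flatten location + reportName:metric
    // into one string; a real APT-aware registry would likely index
    // by the location tree instead.
    private String key(Object location, String reportName, String metric) {
        return location + "/" + reportName + ":" + metric;
    }

    // A null value is stored as-is, representing "not applicable".
    public void reportMetric(Object location, String reportName, String metric, Number value) {
        metrics.put(key(location, reportName, metric), value);
    }

    public Number getMetric(Object location, String reportName, String metric)
            throws MetricNotFoundException {
        String k = key(location, reportName, metric);
        if (!metrics.containsKey(k)) {
            throw new MetricNotFoundException(k);
        }
        return metrics.get(k);
    }

    public static void main(String[] args) throws Exception {
        InMemoryMetricRegistry registry = new InMemoryMetricRegistry();
        registry.reportMetric("project/Foo.java", "checkstyle", "errors", 3);
        System.out.println(registry.getMetric("project/Foo.java", "checkstyle", "errors")); // prints "3"
    }
}
```

Note the distinction the two-method contract forces: a metric that was
reported as null is returned as null ("not applicable"), while a metric
that was never reported at all raises MetricNotFoundException.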

