mosche commented on code in PR #22157:
URL: https://github.com/apache/beam/pull/22157#discussion_r925702807
##########
runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/AggregatorMetric.java:
##########
@@ -33,7 +41,35 @@ public static AggregatorMetric of(final NamedAggregators namedAggregators) {
     return new AggregatorMetric(namedAggregators);
   }
-  NamedAggregators getNamedAggregators() {
-    return namedAggregators;
+  @Override
+  public Map<String, Gauge<Double>> getValue(String prefix, MetricFilter filter) {
+    Map<String, Gauge<Double>> metrics = new HashMap<>();
+    for (Map.Entry<String, ?> entry : namedAggregators.renderAll().entrySet()) {
+      String name = prefix + "." + entry.getKey();
+      Object rawValue = entry.getValue();
+      if (rawValue == null) {
+        continue;
+      }
+      try {
+        Gauge<Double> gauge = staticGauge(rawValue);
+        if (filter.matches(name, gauge)) {
+          metrics.put(name, gauge);
+        }
+      } catch (NumberFormatException e) {
+        LOG.warn(
+            "Metric `{}` of type {} can't be reported, conversion to double failed.",
+            name,
+            rawValue.getClass().getSimpleName(),
+            e);
+      }
+    }
+    return metrics;
+  }
+
+  // Metric type is assumed to be compatible with Double
Review Comment:
Yes 👍 Among the known supported subtypes of `Metric`, only `Gauge` and
`MetricRegistry` fit. `MetricRegistry` would be even better, but would also
require a lot more changes. Spark 3 has a plugin framework that would simplify
a lot of this (no more custom sinks, and no more configuration nightmare for
users). When migrating to a metrics plugin, this needs to turn into a registry.
One thing at a time ;)
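
For context, here is a minimal standalone sketch of the conversion the diff relies on: rendering an aggregator value as a fixed-value `Gauge<Double>`. The `Gauge` interface is simplified from the Dropwizard Metrics one (`T getValue()` only), and this `staticGauge` helper is a hypothetical stand-in for the one referenced in the hunk, not the actual Beam implementation:

```java
import java.util.HashMap;
import java.util.Map;

public class StaticGaugeSketch {
  // Simplified stand-in for com.codahale.metrics.Gauge<T>.
  interface Gauge<T> {
    T getValue();
  }

  // Hypothetical helper mirroring the `staticGauge(rawValue)` call in the
  // diff: coerce the rendered value to a double, throwing
  // NumberFormatException for incompatible types (caught by the caller).
  static Gauge<Double> staticGauge(Object rawValue) {
    double value =
        rawValue instanceof Number
            ? ((Number) rawValue).doubleValue()
            : Double.parseDouble(rawValue.toString());
    return () -> value;
  }

  public static void main(String[] args) {
    Map<String, Object> rendered = new HashMap<>();
    rendered.put("sum", 42L);
    rendered.put("label", "not-a-number");

    Map<String, Gauge<Double>> metrics = new HashMap<>();
    for (Map.Entry<String, Object> entry : rendered.entrySet()) {
      try {
        metrics.put("beam." + entry.getKey(), staticGauge(entry.getValue()));
      } catch (NumberFormatException e) {
        // Non-numeric values are skipped, matching the LOG.warn path above.
        System.out.println("skipped " + entry.getKey());
      }
    }
    System.out.println("sum=" + metrics.get("beam.sum").getValue());
  }
}
```

This keeps the gauge immutable (a captured `double`), which is why a later switch to a `MetricRegistry` would be a structurally different change.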
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]