davidak09 commented on a change in pull request #12063:
URL: https://github.com/apache/beam/pull/12063#discussion_r452823257



##########
File path: runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/SparkMetricsContainerStepMap.java
##########
@@ -27,7 +27,7 @@
 
   @Override
   public String toString() {
-    return new SparkBeamMetric().renderAll().toString();
+    return asAttemptedOnlyMetricResults(this).toString();

Review comment:
       We ran our application built with Beam `2.18.0` and the metrics were displayed correctly; with this fix, Spark displays the same results.
   
   The change that broke this for Spark was part of BEAM-9600 - #11369 .
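The fix above makes `toString()` delegate to the standard metrics rendering path (`asAttemptedOnlyMetricResults`) instead of a runner-specific renderer, so any code that logs the map goes through one consistent path. A minimal, self-contained sketch of that delegation pattern (all class and method names here are illustrative stand-ins, not Beam's actual API):

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative stand-in for a per-step metrics container map.
class StepMetrics {
  private final Map<String, Long> counters = new TreeMap<>();

  void inc(String name, long delta) {
    counters.merge(name, delta, Long::sum);
  }

  // Render the attempted metric values. toString() delegates here so
  // that printing or logging the map always uses the same rendering.
  String renderAttempted() {
    return counters.toString();
  }

  @Override
  public String toString() {
    return renderAttempted();
  }
}

public class Demo {
  public static void main(String[] args) {
    StepMetrics m = new StepMetrics();
    m.inc("elementsRead", 3);
    m.inc("elementsRead", 2);
    System.out.println(m); // prints {elementsRead=5}
  }
}
```

The design point is the same as in the diff: the `toString()` override should not construct its own renderer, it should reuse the canonical one.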

##########
File path: runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/MetricsAccumulator.java
##########
@@ -58,13 +57,13 @@ public static void init(SparkPipelineOptions opts, JavaSparkContext jsc) {
              opts.isStreaming()
                  ? Optional.of(new CheckpointDir(opts.getCheckpointDir()))
                  : Optional.absent();
-          MetricsContainerStepMap metricsContainerStepMap = new MetricsContainerStepMap();
+          SparkMetricsContainerStepMap metricsContainerStepMap = new SparkMetricsContainerStepMap();

Review comment:
       You're right, it's enough to change it only in the initialization. I fixed it.
   
   I will try to add a regression test.
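The bug being guarded against was that rendering the accumulator's value no longer reflected the metric values. A hedged sketch of what such a regression check could look like, using plain Java with illustrative stand-in types (this is not the actual test added to the PR, and the names are hypothetical):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Illustrative stand-in for the Spark metrics container map discussed above.
class MetricsAccumulatorSketch {
  private final ConcurrentMap<String, Long> value = new ConcurrentHashMap<>();

  void add(String counter, long n) {
    value.merge(counter, n, Long::sum);
  }

  @Override
  public String toString() {
    return value.toString();
  }
}

public class RegressionSketch {
  public static void main(String[] args) {
    MetricsAccumulatorSketch acc = new MetricsAccumulatorSketch();
    acc.add("records", 42L);
    // The regression was that the rendered string did not reflect
    // metric values; the check is simply that it now does.
    if (!acc.toString().contains("records=42")) {
      throw new AssertionError("metrics not rendered: " + acc);
    }
    System.out.println("ok");
  }
}
```

In the real test this would go through the Spark runner's accumulator initialization path, so that the `SparkMetricsContainerStepMap` substitution from the diff is what is actually exercised.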




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
