[
https://issues.apache.org/jira/browse/BEAM-10294?focusedWorklogId=457144&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-457144
]
ASF GitHub Bot logged work on BEAM-10294:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 10/Jul/20 12:52
Start Date: 10/Jul/20 12:52
Worklog Time Spent: 10m
Work Description: davidak09 commented on a change in pull request #12063:
URL: https://github.com/apache/beam/pull/12063#discussion_r452823257
##########
File path:
runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/SparkMetricsContainerStepMap.java
##########
@@ -27,7 +27,7 @@
@Override
public String toString() {
- return new SparkBeamMetric().renderAll().toString();
+ return asAttemptedOnlyMetricResults(this).toString();
Review comment:
We ran our application built with Beam `2.18.0` and the metrics are
displayed correctly; with this fix Spark displays the same results.
The change that broke the Spark display was part of BEAM-9600 - #11369 .
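For context on why changing `toString()` fixes the display: Spark renders accumulator values through their `toString()` method, so a map whose default rendering dumps proto-backed bytes is unreadable in the history server. The following is a minimal, self-contained sketch of that pattern only; the class and metric names are hypothetical stand-ins, not the actual Beam or Spark API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical stand-in for SparkMetricsContainerStepMap: overriding
// toString() to decode the byte-backed metric values makes them readable
// wherever Spark renders the accumulator as a string.
class ReadableMetricsStepMap extends LinkedHashMap<String, byte[]> {
    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder("{");
        for (Map.Entry<String, byte[]> e : entrySet()) {
            if (sb.length() > 1) {
                sb.append(", ");
            }
            // Decode the raw bytes into a human-readable value before rendering.
            sb.append(e.getKey()).append('=').append(new String(e.getValue()));
        }
        return sb.append('}').toString();
    }
}

public class Demo {
    public static void main(String[] args) {
        ReadableMetricsStepMap metrics = new ReadableMetricsStepMap();
        metrics.put("elementsRead", "42".getBytes());
        // The inherited LinkedHashMap toString() would print "[B@..." for the bytes;
        // the override prints the decoded value instead.
        System.out.println(metrics);
    }
}
```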
##########
File path:
runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/MetricsAccumulator.java
##########
@@ -58,13 +57,13 @@ public static void init(SparkPipelineOptions opts, JavaSparkContext jsc) {
opts.isStreaming()
? Optional.of(new CheckpointDir(opts.getCheckpointDir()))
: Optional.absent();
- MetricsContainerStepMap metricsContainerStepMap = new MetricsContainerStepMap();
+ SparkMetricsContainerStepMap metricsContainerStepMap = new SparkMetricsContainerStepMap();
Review comment:
You're right, it's enough to change it just in the initialization; I fixed it.
I will try to add a regression test.
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 457144)
Time Spent: 1.5h (was: 1h 20m)
> Beam metrics are unreadable in Spark history server
> ---------------------------------------------------
>
> Key: BEAM-10294
> URL: https://issues.apache.org/jira/browse/BEAM-10294
> Project: Beam
> Issue Type: Bug
> Components: runner-spark
> Affects Versions: 2.13.0
> Reporter: David Janicek
> Assignee: David Janicek
> Priority: P2
> Attachments: image-2020-06-22-13-48-08-880.png
>
> Time Spent: 1.5h
> Remaining Estimate: 0h
>
> Beam metrics shown in the Spark history server are not readable. They are
> rendered as JSON created from the protobuf defined in *beam_job_api.proto*,
> where the metric's value is defined as bytes.
> !image-2020-06-22-13-48-08-880.png!
> A similar issue was already addressed and fixed in BEAM-6062, but the fix was
> broken by BEAM-4552.
> A solution could be to use *SparkMetricsContainerStepMap* instead of
> *MetricsContainerStepMap* inside *MetricsContainerStepMapAccumulator*, as in
> BEAM-6062.
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)