[
https://issues.apache.org/jira/browse/BEAM-10294?focusedWorklogId=461674&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-461674
]
ASF GitHub Bot logged work on BEAM-10294:
-----------------------------------------
Author: ASF GitHub Bot
Created on: 21/Jul/20 17:21
Start Date: 21/Jul/20 17:21
Worklog Time Spent: 10m
Work Description: ibzib commented on a change in pull request #12063:
URL: https://github.com/apache/beam/pull/12063#discussion_r458263018
##########
File path:
runners/spark/src/main/java/org/apache/beam/runners/spark/metrics/SparkMetricsContainerStepMap.java
##########
@@ -27,7 +27,7 @@
@Override
public String toString() {
- return new SparkBeamMetric().renderAll().toString();
+ return asAttemptedOnlyMetricResults(this).toString();
Review comment:
There are two ways of running Beam Python pipelines on Flink/Spark. One
involves starting a Java job server; the other (newer) way does not require a
Java job server and uses only Python. When using a Java job server, it's easy to
get metrics, since the job server has access to the Flink/Spark context object.
But in the Python-only case, we rely on the Flink REST API to get metrics, so the
formatting of the results displayed there is important.
The reason metrics formatting isn't a problem for Spark is that
spark_uber_jar_job_server.py just doesn't implement get_metrics yet.
The problem is that if we want to make `MetricsContainerStepMap::toString`
human-readable, we'll need to make the protobuf-formatted metrics accessible
somewhere else.
Perhaps there is a compromise, though, if we can format the metrics so that
they are readable by both the proto parser and humans.
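For reference, a minimal sketch of what the human-readable side amounts to,
assuming the runners-core metrics API (asAttemptedOnlyMetricResults and
MetricResults.queryMetrics); the ReadableMetricsSketch class and render method
are illustrative only, not part of the PR:

import org.apache.beam.runners.core.metrics.MetricsContainerStepMap;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricResults;
import org.apache.beam.sdk.metrics.MetricsFilter;

import static org.apache.beam.runners.core.metrics.MetricsContainerStepMap.asAttemptedOnlyMetricResults;

public final class ReadableMetricsSketch {

  /** Renders each attempted counter as "namespace:name = value" instead of raw proto bytes. */
  static String render(MetricsContainerStepMap containers) {
    // Resolve the proto-backed containers into queryable MetricResults.
    MetricResults results = asAttemptedOnlyMetricResults(containers);
    // An empty filter matches all metrics.
    MetricQueryResults query = results.queryMetrics(MetricsFilter.builder().build());
    StringBuilder sb = new StringBuilder();
    for (MetricResult<Long> counter : query.getCounters()) {
      sb.append(counter.getName()).append(" = ").append(counter.getAttempted()).append('\n');
    }
    return sb.toString();
  }
}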
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 461674)
Time Spent: 2h 50m (was: 2h 40m)
> Beam metrics are unreadable in Spark history server
> ---------------------------------------------------
>
> Key: BEAM-10294
> URL: https://issues.apache.org/jira/browse/BEAM-10294
> Project: Beam
> Issue Type: Bug
> Components: runner-spark
> Affects Versions: 2.13.0
> Reporter: David Janicek
> Assignee: David Janicek
> Priority: P2
> Attachments: image-2020-06-22-13-48-08-880.png
>
> Time Spent: 2h 50m
> Remaining Estimate: 0h
>
> Beam metrics shown in the Spark history server are not readable. They're
> rendered as JSON created from the protobuf defined in *beam_job_api.proto*,
> where the metric's value is defined as bytes.
> !image-2020-06-22-13-48-08-880.png!
> A similar issue was already addressed and fixed in BEAM-6062, but the fix was
> broken by BEAM-4552.
> A solution could be to use *SparkMetricsContainerStepMap* instead of
> *MetricsContainerStepMap* inside *MetricsContainerStepMapAccumulator*, as in
> BEAM-6062 (a sketch follows below).
>
>
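For reference, a minimal sketch of the proposed subclass, assuming it extends
MetricsContainerStepMap and overrides toString() as in the diff discussed above;
the package follows the file path in the review, and the rest is illustrative:

package org.apache.beam.runners.spark.metrics;

import org.apache.beam.runners.core.metrics.MetricsContainerStepMap;

import static org.apache.beam.runners.core.metrics.MetricsContainerStepMap.asAttemptedOnlyMetricResults;

/** Held by MetricsContainerStepMapAccumulator so the Spark history server shows readable metrics. */
public class SparkMetricsContainerStepMap extends MetricsContainerStepMap {
  @Override
  public String toString() {
    // Render attempted metric names and values instead of proto-derived JSON bytes.
    return asAttemptedOnlyMetricResults(this).toString();
  }
}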
--
This message was sent by Atlassian Jira
(v8.3.4#803005)