[ https://issues.apache.org/jira/browse/BEAM-11213?focusedWorklogId=576071&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-576071 ]

ASF GitHub Bot logged work on BEAM-11213:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 02/Apr/21 12:47
            Start Date: 02/Apr/21 12:47
    Worklog Time Spent: 10m 
      Work Description: iemejia commented on a change in pull request #14409:
URL: https://github.com/apache/beam/pull/14409#discussion_r606220672



##########
File path: runners/spark/src/main/java/org/apache/beam/runners/spark/util/SparkCommon.java
##########
@@ -0,0 +1,79 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.beam.runners.spark.util;
+
+import java.net.URI;
+import java.net.URISyntaxException;
+import org.apache.beam.runners.spark.SparkPipelineOptions;
+import org.apache.beam.sdk.annotations.Internal;
+import org.apache.spark.api.java.JavaSparkContext;
+import org.apache.spark.scheduler.EventLoggingListener;
+import org.apache.spark.scheduler.SparkListenerExecutorAdded;
+import org.apache.spark.scheduler.cluster.ExecutorInfo;
+import org.checkerframework.checker.nullness.qual.Nullable;
+import scala.Tuple2;
+
+/** Common methods to build Spark specific objects used by different runners. */
+@Internal
+@SuppressWarnings({
+  "nullness" // TODO(https://issues.apache.org/jira/browse/BEAM-10402)
+})
+public class SparkCommon {
+
+  /**
+   * Starts an EventLoggingListener to save Beam Metrics on Spark's History Server if event logging
+   * is enabled.
+   *
+   * @return The associated EventLoggingListener or null if it could not be started.
+   */
+  public static @Nullable EventLoggingListener startEventLoggingListener(
+      final JavaSparkContext jsc, SparkPipelineOptions pipelineOptions, long startTime) {
+    EventLoggingListener eventLoggingListener = null;
+    try {
+      if (jsc.getConf().getBoolean("spark.eventLog.enabled", false)) {

Review comment:
       This is the [official Spark way to configure](https://spark.apache.org/docs/latest/configuration.html#spark-ui) what the `eventLogEnabled` and `sparkHistoryDir` options were doing, so it is better to use these.
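
A minimal sketch of what the suggested approach could look like, reading Spark's standard `spark.eventLog.enabled` and `spark.eventLog.dir` properties instead of the Beam-specific options. The helper name, the fallback log directory, and the surrounding wiring are assumptions for illustration; only the property names and the Spark 2.x `EventLoggingListener` constructor come from Spark itself:

```java
import java.net.URI;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.scheduler.EventLoggingListener;
import org.checkerframework.checker.nullness.qual.Nullable;

public class EventLogSketch {

  /** Hypothetical helper: starts event logging only if Spark's own property enables it. */
  public static @Nullable EventLoggingListener startIfEnabled(JavaSparkContext jsc) {
    SparkConf conf = jsc.getConf();
    // spark.eventLog.enabled defaults to false, per the Spark configuration docs.
    if (!conf.getBoolean("spark.eventLog.enabled", false)) {
      return null;
    }
    // spark.eventLog.dir is the directory the History Server reads from; the
    // fallback value here is an assumption for the sketch, not a Spark default.
    URI logDir = URI.create(conf.get("spark.eventLog.dir", "/tmp/spark-events"));
    EventLoggingListener listener =
        new EventLoggingListener(
            conf.getAppId(),
            scala.Option.<String>empty(),
            logDir,
            conf,
            jsc.hadoopConfiguration());
    listener.start();
    // Register the listener so it receives the application events it will log.
    jsc.sc().addSparkListener(listener);
    return listener;
  }
}
```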




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


Issue Time Tracking
-------------------

    Worklog Id:     (was: 576071)
    Time Spent: 12h 10m  (was: 12h)

> Beam metrics should be displayed in Spark UI
> --------------------------------------------
>
>                 Key: BEAM-11213
>                 URL: https://issues.apache.org/jira/browse/BEAM-11213
>             Project: Beam
>          Issue Type: Wish
>          Components: runner-spark
>            Reporter: Kyle Weaver
>            Assignee: Tomasz Szerszen
>            Priority: P2
>              Labels: portability-spark
>             Fix For: 2.29.0
>
>          Time Spent: 12h 10m
>  Remaining Estimate: 0h
>
> All Beam metrics are visible in the Spark UI in a single accumulator value 
> (in the "Accumulators" tab), which is a large, hard-to-read blob. Originally, 
> this blob was rendered in a bespoke format 
> (https://github.com/apache/beam/blob/ead80b469ffeeddcd8e9e5c8dc462eec0b0ffc6b/sdks/java/core/src/main/java/org/apache/beam/sdk/metrics/MetricQueryResults.java#L63-L72).
>  I changed the format to JSON so it could be easily deserialized (BEAM-9600). 
> But then an issue was filed (BEAM-10294) reporting that the new JSON format 
> was harder to read than the original bespoke format. The temporary fix was to 
> revert to the bespoke format in Spark, while allowing Flink to continue to 
> use JSON. However, if Beam metrics are only visible as an accumulator, then 
> they are also unreadable because the payloads are in binary form (BEAM-10719).
> Having metrics visible in Spark's "Metrics" tab would A) make metrics easier 
> to read (even compared to the bespoke accumulator string format), and closer 
> to what users of Beamless Spark expect, and B) free us to use the accumulator 
> however we wish for Beam internal purposes, without worrying about 
> readability.
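
A minimal sketch of the general mechanism this wish points at, assuming Spark 2.x's Codahale-backed metrics system: implement Spark's `org.apache.spark.metrics.source.Source` and register it with the driver's `MetricsSystem`, which makes the gauges visible to any configured Spark metrics sinks instead of burying them in an accumulator blob. The class name, metric name, and constant gauge value below are illustrative assumptions, not the runner's actual code:

```java
import com.codahale.metrics.Gauge;
import com.codahale.metrics.MetricRegistry;
import org.apache.spark.SparkEnv;
import org.apache.spark.metrics.source.Source;

public class BeamMetricsSourceSketch implements Source {

  private final MetricRegistry registry = new MetricRegistry();

  public BeamMetricsSourceSketch() {
    // One gauge per Beam metric; the fixed value stands in for a real lookup.
    registry.register("elementsProcessed", (Gauge<Long>) () -> 42L);
  }

  @Override
  public String sourceName() {
    return "Beam";
  }

  @Override
  public MetricRegistry metricRegistry() {
    return registry;
  }

  public static void register() {
    // Registering with the driver's MetricsSystem exposes the gauges to
    // whatever metrics sinks the Spark deployment has configured.
    SparkEnv.get().metricsSystem().registerSource(new BeamMetricsSourceSketch());
  }
}
```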



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
