Github user jiangxb1987 commented on a diff in the pull request:
https://github.com/apache/spark/pull/21898#discussion_r207823603
--- Diff: core/src/main/scala/org/apache/spark/BarrierCoordinator.scala ---
@@ -0,0 +1,230 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark
+
+import java.util.{Timer, TimerTask}
+import java.util.concurrent.ConcurrentHashMap
+import java.util.function.Consumer
+
+import scala.collection.mutable.ArrayBuffer
+
+import org.apache.spark.internal.Logging
+import org.apache.spark.rpc.{RpcCallContext, RpcEnv, ThreadSafeRpcEndpoint}
+import org.apache.spark.scheduler.{LiveListenerBus, SparkListener, SparkListenerStageCompleted}
+
+/**
+ * For each barrier stage attempt, at most one barrier() call can be active at any time, thus
+ * we can use (stageId, stageAttemptId) to identify the stage attempt where the barrier() call
+ * comes from.
+ */
+private case class ContextBarrierId(stageId: Int, stageAttemptId: Int) {
+  override def toString: String = s"Stage $stageId (Attempt $stageAttemptId)"
+}
+
+/**
+ * A coordinator that handles all global sync requests from BarrierTaskContext. Each global sync
+ * request is generated by `BarrierTaskContext.barrier()`, and identified by
+ * stageId + stageAttemptId + barrierEpoch. The coordinator replies to all the blocking global
+ * sync requests once all the requests for a group of `barrier()` calls have been received. If
+ * the coordinator is unable to collect enough global sync requests within a configured time, it
+ * fails all the pending requests with a timeout exception.
+ */
+private[spark] class BarrierCoordinator(
+    timeoutInSecs: Long,
+    listenerBus: LiveListenerBus,
+    override val rpcEnv: RpcEnv) extends ThreadSafeRpcEndpoint with Logging {
+
+  private lazy val timer = new Timer("BarrierCoordinator barrier epoch increment timer")
--- End diff --
This is certainly a potential bug in `SparkSubmit`, but it is not related to the changes made in this PR, so I don't think it should block this PR.
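
(For context, below is a minimal standalone sketch — not the PR's code — of the collect-then-reply-or-timeout behavior described in the class scaladoc quoted above. `Promise[Unit]` stands in for the `RpcCallContext` replies, and names such as `BarrierStateSketch`, `numTasks`, and `waiting` are purely illustrative.)

```scala
import java.util.{Timer, TimerTask}
import scala.collection.mutable.ArrayBuffer
import scala.concurrent.Promise
import scala.util.{Failure, Success}

// Tracks one group of barrier() calls: collect numTasks requests, then
// reply to all of them, or fail them all if the timeout fires first.
class BarrierStateSketch(numTasks: Int, timeoutInSecs: Long) {
  private val timer = new Timer("barrier-sync-timeout-sketch")
  private val waiting = ArrayBuffer.empty[Promise[Unit]]
  private var epoch = 0  // identifies the current group of barrier() calls

  // Called once per task; the returned promise completes when every task
  // in the stage has called in, or fails if the timeout fires first.
  def handleRequest(): Promise[Unit] = synchronized {
    val p = Promise[Unit]()
    if (waiting.isEmpty) scheduleTimeout(epoch)  // first request of this epoch
    waiting += p
    if (waiting.size == numTasks) {
      waiting.foreach(_.complete(Success(())))   // unblock every caller
      waiting.clear()
      epoch += 1                                 // next barrier() call starts a new group
    }
    p
  }

  private def scheduleTimeout(epochAtSchedule: Int): Unit = {
    timer.schedule(new TimerTask {
      override def run(): Unit = BarrierStateSketch.this.synchronized {
        // Fail only if the epoch the timer was armed for is still incomplete.
        if (epochAtSchedule == epoch && waiting.nonEmpty) {
          val cause = new Exception(
            s"Couldn't collect all $numTasks barrier sync requests within $timeoutInSecs second(s).")
          waiting.foreach(_.complete(Failure(cause)))
          waiting.clear()
          epoch += 1
        }
      }
    }, timeoutInSecs * 1000)
  }
}
```

The diff itself imports `ConcurrentHashMap` and defines `ContextBarrierId(stageId, stageAttemptId)`, which suggests the real coordinator keeps one such piece of state per stage attempt and uses the barrier epoch to tell successive `barrier()` groups apart; the sketch above only models a single stage attempt.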
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]