Repository: spark
Updated Branches:
  refs/heads/master 25bba58d1 -> 017cdf2be


[SPARK-13711][CORE] Don't call SparkUncaughtExceptionHandler in AppClient as it's in driver

## What changes were proposed in this pull request?

AppClient runs on the driver side. It should not call `Utils.tryOrExit`, because that 
forwards any exception to SparkUncaughtExceptionHandler, which calls `System.exit` 
and terminates the driver JVM. This PR removes the `Utils.tryOrExit` wrapper.
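For context, `Utils.tryOrExit` wraps a block so that any failure escalates to the process-wide uncaught-exception handler, which ends in `System.exit`. The sketch below illustrates the hazard with simplified, hypothetical stand-ins (the real handler exits with `SparkExitCode.UNCAUGHT_EXCEPTION`; here the exit is only recorded so the behavior is observable):

```scala
// Simplified sketch of the tryOrExit hazard; not the actual Spark source.
object TryOrExitSketch {
  // Stand-in for System.exit so the sketch doesn't kill the JVM: records
  // the exit code that SparkUncaughtExceptionHandler would have used.
  var exitCode: Option[Int] = None

  // Stand-in for SparkUncaughtExceptionHandler: in real Spark this logs the
  // throwable and calls System.exit; here we just record the would-be exit.
  def uncaughtException(t: Throwable): Unit = {
    exitCode = Some(50) // SparkExitCode.UNCAUGHT_EXCEPTION
  }

  // Sketch of Utils.tryOrExit: any Throwable escalates to the handler above
  // instead of propagating to the caller.
  def tryOrExit(block: => Unit): Unit =
    try block catch { case t: Throwable => uncaughtException(t) }

  def main(args: Array[String]): Unit = {
    // In driver-side code such as AppClient's registration retry, a transient
    // failure inside the wrapped block would take down the whole driver:
    tryOrExit { throw new RuntimeException("transient registration failure") }
    println(s"driver would have exited with code ${exitCode.get}")
  }
}
```

Running the wrapped retry logic directly (as this PR does) lets a failure surface as an ordinary exception in the driver rather than a JVM shutdown.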

## How was this patch tested?

Manual tests.

Author: Shixiong Zhu <shixi...@databricks.com>

Closes #11566 from zsxwing/SPARK-13711.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/017cdf2b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/017cdf2b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/017cdf2b

Branch: refs/heads/master
Commit: 017cdf2be67776978a940609d610afea79856b17
Parents: 25bba58
Author: Shixiong Zhu <shixi...@databricks.com>
Authored: Mon Mar 7 20:56:08 2016 -0800
Committer: Shixiong Zhu <shixi...@databricks.com>
Committed: Mon Mar 7 20:56:08 2016 -0800

----------------------------------------------------------------------
 .../apache/spark/deploy/client/AppClient.scala    | 18 ++++++++----------
 1 file changed, 8 insertions(+), 10 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/017cdf2b/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala b/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala
index a7a0a78..b9dec62 100644
--- a/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala
@@ -125,16 +125,14 @@ private[spark] class AppClient(
       registerMasterFutures.set(tryRegisterAllMasters())
       registrationRetryTimer.set(registrationRetryThread.schedule(new Runnable {
         override def run(): Unit = {
-          Utils.tryOrExit {
-            if (registered.get) {
-              registerMasterFutures.get.foreach(_.cancel(true))
-              registerMasterThreadPool.shutdownNow()
-            } else if (nthRetry >= REGISTRATION_RETRIES) {
-              markDead("All masters are unresponsive! Giving up.")
-            } else {
-              registerMasterFutures.get.foreach(_.cancel(true))
-              registerWithMaster(nthRetry + 1)
-            }
+          if (registered.get) {
+            registerMasterFutures.get.foreach(_.cancel(true))
+            registerMasterThreadPool.shutdownNow()
+          } else if (nthRetry >= REGISTRATION_RETRIES) {
+            markDead("All masters are unresponsive! Giving up.")
+          } else {
+            registerMasterFutures.get.foreach(_.cancel(true))
+            registerWithMaster(nthRetry + 1)
           }
         }
       }, REGISTRATION_TIMEOUT_SECONDS, TimeUnit.SECONDS))

