Repository: spark
Updated Branches:
  refs/heads/master 16fd2a2f4 -> c7b29ae64


[SPARK-10851] [SPARKR] Exception not failing R applications (in yarn cluster mode)

The YARN backend doesn't like it when user code calls System.exit, since it
cannot know the exit status and thus cannot set an appropriate final status
for the application.

This PR removes the use of System.exit to exit the RRunner. Instead, when the
R process running a SparkR script returns a non-zero exit code, RRunner throws
SparkUserAppException, which the ApplicationMaster catches and thereby knows
the application failed. For other failures, RRunner throws SparkException.
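As an illustration of the pattern (not Spark's actual ApplicationMaster code),
here is a minimal, self-contained Scala sketch of how a caller that invokes
user code can recover the exit status from an exception instead of losing it
to System.exit. SparkUserAppException is private[spark], so the sketch defines
a stand-in with the same shape; DriverHarness and runUserMain are hypothetical
names, not Spark APIs.

// Stand-in mirroring SparkUserAppException: a case class carrying the user
// application's exit code.
case class UserAppException(exitCode: Int)
  extends Exception(s"User application exited with $exitCode")

object DriverHarness {
  // Hypothetical harness: run user code and translate exceptions into an
  // exit status the caller (e.g. a YARN ApplicationMaster) can report.
  def runUserMain(userMain: () => Unit): Int = {
    try {
      userMain()
      0 // success
    } catch {
      case UserAppException(code) =>
        code // the user application's own exit code, preserved
      case e: Exception =>
        System.err.println(s"Application failed: ${e.getMessage}")
        1 // framework-level failure (e.g. the backend never initialized)
    }
  }

  def main(args: Array[String]): Unit = {
    val status = runUserMain(() => throw UserAppException(42))
    println(s"final status: $status") // prints "final status: 42"
  }
}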

Author: Sun Rui <rui....@intel.com>

Closes #8938 from sun-rui/SPARK-10851.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c7b29ae6
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c7b29ae6
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c7b29ae6

Branch: refs/heads/master
Commit: c7b29ae6418368a1266b960ba8776317fd867313
Parents: 16fd2a2
Author: Sun Rui <rui....@intel.com>
Authored: Wed Sep 30 11:03:08 2015 -0700
Committer: Andrew Or <and...@databricks.com>
Committed: Wed Sep 30 11:03:08 2015 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/deploy/RRunner.scala | 10 +++++++---
 1 file changed, 7 insertions(+), 3 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c7b29ae6/core/src/main/scala/org/apache/spark/deploy/RRunner.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/RRunner.scala b/core/src/main/scala/org/apache/spark/deploy/RRunner.scala
index 05b954c..58cc1f9 100644
--- a/core/src/main/scala/org/apache/spark/deploy/RRunner.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/RRunner.scala
@@ -25,6 +25,7 @@ import scala.collection.JavaConverters._
 import org.apache.hadoop.fs.Path
 
 import org.apache.spark.api.r.{RBackend, RUtils}
+import org.apache.spark.{SparkException, SparkUserAppException}
 import org.apache.spark.util.RedirectThread
 
 /**
@@ -84,12 +85,15 @@ object RRunner {
       } finally {
         sparkRBackend.close()
       }
-      System.exit(returnCode)
+      if (returnCode != 0) {
+        throw new SparkUserAppException(returnCode)
+      }
     } else {
+      val errorMessage = s"SparkR backend did not initialize in 
$backendTimeout seconds"
       // scalastyle:off println
-      System.err.println("SparkR backend did not initialize in " + 
backendTimeout + " seconds")
+      System.err.println(errorMessage)
       // scalastyle:on println
-      System.exit(-1)
+      throw new SparkException(errorMessage)
     }
   }
 }
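
For context, a self-contained sketch of the timeout-then-throw pattern the
patch adopts for backend startup (simplified and illustrative: the real RRunner
signals readiness from the thread that runs the RBackend, and the timeout value
below is an assumption, not Spark's configured default):

import java.util.concurrent.{Semaphore, TimeUnit}

object BackendWaitSketch {
  def main(args: Array[String]): Unit = {
    val backendTimeout = 120 // seconds; illustrative value only
    val initialized = new Semaphore(0)

    // Stand-in for the thread that initializes the SparkR backend and then
    // signals readiness to the main thread.
    new Thread("backend-init") {
      override def run(): Unit = {
        // ... backend initialization would happen here ...
        initialized.release()
      }
    }.start()

    if (initialized.tryAcquire(backendTimeout, TimeUnit.SECONDS)) {
      // Launch the R process here; on a non-zero return code, throw an
      // exception carrying that code instead of calling System.exit.
      println("backend ready")
    } else {
      val errorMessage = s"SparkR backend did not initialize in $backendTimeout seconds"
      // Throwing (rather than System.exit(-1)) lets the YARN
      // ApplicationMaster observe the failure and set a final status.
      throw new RuntimeException(errorMessage)
    }
  }
}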

