I tried another piece of test code:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    def main(args: Array[String]) {
      if (args.length != 1) {
        Util.printLog("ERROR", "Args error - arg1: BASE_DIR")
        exit(101)
      }
      val currentFile = args(0).toString
      val DB = "test_spark"
      val tableName = "src"

      val sparkConf = new SparkConf().setAppName(s"HiveFromSpark")
      val sc = new SparkContext(sparkConf)
      val hiveContext = new HiveContext(sc)

      // Before exit
      Util.printLog("INFO", "Exit")
      exit(100)
    }
There are two `exit` calls in this code. If the args are wrong, spark-submit gets the return code 101; but if the args are correct, spark-submit does not get the second return code, 100. What is the difference between these two `exit` calls? I am confused.
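For reference, here is a minimal sketch of a variant I may try next. It assumes (I have not verified this) that an uncaught exception from the user class makes the ApplicationMaster report a FAILED final status, which spark-submit should then turn into a nonzero exit code after SPARK-3877:

    def main(args: Array[String]) {
      if (args.length != 1) {
        Util.printLog("ERROR", "Args error - arg1: BASE_DIR")
        // Throw instead of exit(101): an uncaught exception from the user
        // class should make the ApplicationMaster mark the attempt FAILED.
        throw new IllegalArgumentException("arg1: BASE_DIR is required")
      }
      // ... rest of the job ...
    }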
From: lin_q...@outlook.com
To: u...@spark.incubator.apache.org
Subject: RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in 
yarn-cluster mode
Date: Fri, 5 Dec 2014 17:11:39 +0800




I tried spark client mode, and spark-submit got the correct return code from the Spark job. But in yarn-cluster mode it failed.

From: lin_q...@outlook.com
To: u...@spark.incubator.apache.org
Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in 
yarn-cluster mode
Date: Fri, 5 Dec 2014 16:55:37 +0800




Hi, all:
According to https://github.com/apache/spark/pull/2732, when a Spark job fails or exits nonzero in yarn-cluster mode, spark-submit should get the corresponding return code of the Spark job. But when I tried this on a spark-1.1.1 YARN cluster, spark-submit returned zero anyway.
Here is my Spark code:
    try {
      val dropTable = s"drop table $DB.$tableName"
      hiveContext.hql(dropTable)
      val createTbl = ... // do some thing
      hiveContext.hql(createTbl)
    } catch {
      case ex: Exception => {
        Util.printLog("ERROR", s"create db error.")
        exit(-1)
      }
    }
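One side note on the catch block, in case it matters: on Unix-like systems the exit status a parent process observes is truncated to 8 bits, so `exit(-1)` shows up as 255 rather than -1. A minimal sketch to check this locally (the object name is just for illustration):

    object ExitCodeCheck {
      def main(args: Array[String]) {
        // Exit statuses are truncated to 8 bits on Unix-like systems,
        // so the parent process observes -1 as 255.
        sys.exit(-1)
      }
    }

Running this and then `echo $?` in the shell should print 255.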
Maybe I did something wrong. Is there any hint? Thanks.
