[ https://issues.apache.org/jira/browse/SPARK-33490?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17245287#comment-17245287 ]

Sean R. Owen commented on SPARK-33490:
--------------------------------------

You shouldn't use sys.exit() in general. Spark does inject shutdown hooks that 
run on System.exit(), and I'm not sure whether one of them would stop a normal 
shutdown on purpose. I'm not sure there's a problem to solve here, though, as 
we generally don't want apps to kill the driver.
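
For illustration only, here is a minimal plain-JVM sketch (not Spark code; the demo object name is made up) of what "shutdown hooks run on System.exit()" means. Spark registers its hooks through this same JVM mechanism:

{code:java}
object ShutdownHookDemo extends App {
  // Register a hook the standard JVM way; Spark's own hooks use the
  // same mechanism, so they also run when System.exit() is called.
  Runtime.getRuntime.addShutdownHook(new Thread(() =>
    println("shutdown hook ran before the JVM exited")))

  println("calling sys.exit(-1)")
  sys.exit(-1) // starts an orderly shutdown: hooks run, then the JVM dies
}
{code}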

> Calling System.exit(-1) from a future does not result in the driver 
> reflecting the fact that the Spark app failed.
> -------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-33490
>                 URL: https://issues.apache.org/jira/browse/SPARK-33490
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: Srimantha Rajapaksa
>            Priority: Major
>
> Hi guys,
> When calling System.exit(-1) from within a code block defined for a Future, I 
> observed that the JVM does not exit immediately, as you would expect a Java 
> app to. In fact, it seems only the future's thread exits, and the driver 
> program carries on as normal. The status of the app when it does exit is also 
> misleading: it says FINISHED instead of FAILED. I have attached a code sample 
> (see below) that replicates this issue. Is this a Spark bug or a "feature"? :) 
> Any thoughts?
> Note: I have been able to work around this by not calling System.exit(-1) 
> from within a future (a sketch of the workaround follows the code sample), 
> but this was definitely a gotcha that I thought worthwhile raising.
> {code:java}
> package com.spark.example
> import com.typesafe.scalalogging.Logger
> import org.apache.spark.rdd.RDD
> import org.apache.spark.sql.SparkSession
> import scala.concurrent.ExecutionContext.Implicits.global
> import scala.concurrent.duration.DurationInt
> import scala.concurrent.{Await, Future}
> import scala.util.{Failure, Success, Try}
>
> // Reproduces the issue: sys.exit(-1) called from a Future's onComplete
> // callback does not fail the driver; the app ends with status FINISHED.
> object SystemExitTestApp extends App {
>   val appName = "NumberCountSparkApp"
>   val logger = Logger(getClass)
>   val sparkMaster = "spark://adl-matspm01:7077"
>   val ss = SparkSession.builder()
>     .master(sparkMaster)
>     .appName(appName)
>     .getOrCreate()
>   logger.info(s"Starting test. thread id ${Thread.currentThread().getId}")
>   val data = Seq(1, 2, 3, 4)
>   val rdd: RDD[Int] = ss.sparkContext.parallelize(data)
>   val future = Future {
>     logger.info(s"inside future thread id ${Thread.currentThread().getId}")
>     Try {
>       // Simulate a failure before the Spark action runs.
>       throw new Exception("My dummy exception")
>       rdd.count()
>     } match {
>       case Success(_) =>
>         (true, 1)
>       case Failure(exception) =>
>         logger.info(s"inside future. Exception $exception thread id ${Thread.currentThread().getId}")
>         (false, 0)
>     }
>   }
>   var result: Boolean = true
>   future onComplete {
>     case Success(x) =>
>       logger.info(s"success $x")
>       if (!x._1) {
>         logger.info(s"exiting in the success block of onComplete. thread id ${Thread.currentThread().getId}")
>         result = x._1
>         ss.close()
>         sys.exit(-1) // the reported issue: this does not fail the driver
>       } else {
>         logger.info(s"not exiting in the success block of onComplete. thread id ${Thread.currentThread().getId}")
>         result = x._1
>       }
>     case Failure(exception) =>
>       logger.info(s"exiting in the failure block of onComplete as exception occurred: $exception. thread id ${Thread.currentThread().getId}")
>       result = false
>       sys.exit(-1)
>   }
>   Await.result(future, 3.hours)
>   logger.info(s"should not see this; if we do, use the result outside. thread id ${Thread.currentThread().getId}")
>   if (!result) {
>     logger.info(s"inside main block; should not see this. thread id ${Thread.currentThread().getId}")
>     //sys.exit(-1)
>   }
> }
> {code}
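> A minimal sketch of that workaround (illustrative only, not the exact 
> production code): let the future only compute a value, and make the exit 
> decision on the main thread once Await.result returns:
> {code:java}
> // Workaround sketch: no sys.exit() inside the Future or its callbacks.
> val (succeeded, _) = Await.result(future, 3.hours)
> ss.close()
> if (!succeeded) sys.exit(-1) // exit from the main thread instead
> {code}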
>  
>  



