Hi all,

I am running a Spark (v1.6.1) application via the ./bin/spark-submit
script, with some local changes to the HttpBroadcast module. All the jobs
finish successfully, but the Spark master process then hangs instead of
exiting. The shutdown hook should be invoked at this point, but it never
runs.

I am wondering: what is the condition that triggers
SparkShutdownHookManager.runAll(), and where does the related code live?
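For reference, my current reading of ShutdownHookManager.scala (which may well be wrong, hence this question) is that the registered hooks sit in a priority queue, runAll() drains them highest-priority-first, and runAll() itself is only invoked from a JVM shutdown hook, i.e. once every non-daemon thread has exited or System.exit() is called. A rough sketch of that understanding (class and method names here are illustrative, not Spark's actual API):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.PriorityQueue;

// Sketch of my reading of Spark 1.6's
// org.apache.spark.util.ShutdownHookManager: hooks are kept in a
// priority queue and drained highest-priority-first by runAll().
// runAll() is itself registered as a JVM shutdown hook, so it fires
// only when the JVM begins to exit -- after all non-daemon threads
// have finished, or after System.exit().
public class ShutdownHooksSketch {
    static class Hook implements Comparable<Hook> {
        final int priority;
        final Runnable body;
        Hook(int priority, Runnable body) {
            this.priority = priority;
            this.body = body;
        }
        // Reverse the natural order so poll() yields the highest
        // priority first.
        public int compareTo(Hook other) {
            return Integer.compare(other.priority, this.priority);
        }
    }

    private final PriorityQueue<Hook> hooks = new PriorityQueue<>();

    public void add(int priority, Runnable body) {
        hooks.add(new Hook(priority, body));
    }

    // Drain all hooks in priority order; returns the priorities in the
    // order they ran, just for illustration.
    public List<Integer> runAll() {
        List<Integer> ran = new ArrayList<>();
        Hook h;
        while ((h = hooks.poll()) != null) {
            ran.add(h.priority);
            h.body.run();
        }
        return ran;
    }

    public static void main(String[] args) {
        ShutdownHooksSketch m = new ShutdownHooksSketch();
        m.add(10, () -> System.out.println("temp dir cleanup"));
        m.add(50, () -> System.out.println("SparkContext.stop()"));
        m.add(25, () -> System.out.println("other cleanup"));
        System.out.println(m.runAll()); // highest priority runs first
    }
}
```

If that reading is right, then runAll() can never fire while some non-daemon thread is still alive, so a server thread left running by my HttpBroadcast changes could explain the hang. Is that the right way to think about it?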


2017/05/23 14:47:26.030 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose
tasks have all completed, from pool
2017/05/23 14:47:26.030 INFO DAGScheduler: ResultStage 6 (saveAsTextFile at
package.scala:169) finished in 1.180 s
2017/05/23 14:47:26.030 DEBUG DAGScheduler: After removal of stage 5,
remaining stages = 1
2017/05/23 14:47:26.030 DEBUG DAGScheduler: After removal of stage 6,
remaining stages = 0
2017/05/23 14:47:26.030 INFO DAGScheduler: Job 3 finished: saveAsTextFile
at package.scala:169, took 1.265678 s
(The following lines should have appeared next, but the master hangs here
without SparkShutdownHookManager.runAll() ever being called:)
2017/05/23 14:47:26.065 INFO SparkShutdownHookManager: runAll is called
2017/05/23 14:47:26.068 INFO SparkContext: Invoking stop() from shutdown
hook
2017/05/23 14:47:26.156 INFO SparkUI: Stopped Spark web UI at
http://192.168.50.127:4040


Thanks!
Xiaoye
