Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/7027#discussion_r33330365
  
    --- Diff: 
core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala ---
    @@ -548,6 +548,7 @@ object JarCreationTest extends Logging {
         if (result.nonEmpty) {
           throw new Exception("Could not load user class from jar:\n" + 
result(0))
         }
    +    sc.stop()
    --- End diff --
    
    It actually helps. I added it to `HiveSparkSubmitSuite` in 
https://github.com/apache/spark/pull/7009 and those tests have not failed since 
that PR was merged. From the log I attached to SPARK-8643, it seems that after 
the application finishes, the executors get killed before we call `stop` from 
the shutdown hook.
    ```
    err> 15/06/24 18:24:05 INFO DAGScheduler: Job 3 finished: main at 
NativeMethodAccessorImpl.java:-2, took 0.674795 s
    err> 15/06/24 18:24:05 INFO ExecutorRunner: Killing process!
    err> 15/06/24 18:24:05 INFO ExecutorRunner: Killing process!
    err> 15/06/24 18:24:05 ERROR TaskSchedulerImpl: Lost executor 1 on 
192.168.10.24: remote Rpc client disassociated
    ...
    err> 15/06/24 18:24:05 INFO ExecutorRunner: Launch command: ...
    err> 15/06/24 18:24:06 INFO Worker: Executor app-20150624182313-0000/0 
finished with state EXITED message Command exited with code 143 exitStatus 143
    err> 15/06/24 18:24:06 INFO SparkContext: Invoking stop() from shutdown hook
    ```
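The log illustrates the race: relying on the JVM shutdown hook means `stop()` runs only while the process is already tearing down, so executors can be killed first. Calling `sc.stop()` explicitly at the end of the test's main method avoids that. A minimal sketch of the pattern, using a hypothetical `FakeContext` in place of `SparkContext` (no Spark dependency; purely illustrative):

```scala
// A sketch of why an explicit stop() helps: releasing the resource in a
// finally block runs deterministically before the method returns, while a
// JVM shutdown hook races with process teardown. FakeContext stands in
// for SparkContext and is purely illustrative.
class FakeContext {
  @volatile var stopped = false
  def stop(): Unit = { stopped = true }
}

object ExplicitStopSketch {
  // Runs a "job" and stops the context explicitly, mirroring the
  // `sc.stop()` call added at the end of the test's main method.
  def runJob(): FakeContext = {
    val sc = new FakeContext
    try {
      // ... job body would go here ...
    } finally {
      sc.stop() // deterministic: happens before the method returns
    }
    sc
  }
}
```

With this pattern the context is guaranteed to be stopped by the time the test method exits, rather than at some indeterminate point during JVM shutdown.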


