[ https://issues.apache.org/jira/browse/SPARK-31438?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17091225#comment-17091225 ]

Hyukjin Kwon commented on SPARK-31438:
--------------------------------------

PR https://github.com/apache/spark/pull/28280

> Support JobCleaned Status in SparkListener
> ------------------------------------------
>
>                 Key: SPARK-31438
>                 URL: https://issues.apache.org/jira/browse/SPARK-31438
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.1.0
>            Reporter: Jackey Lee
>            Priority: Major
>
> In Spark, we need to run a hook after a job has been cleaned up, for example to
> delete Hive external temporary paths. This was already discussed in SPARK-31346 and
> [GitHub Pull Request #28129.|https://github.com/apache/spark/pull/28129]
>  The JobEnd status is not suitable for this: JobEnd only marks job completion, so it
> fires as soon as all results have been produced. After that point, the scheduler
> leaves the still-running tasks as zombie tasks and removes abnormal tasks
> asynchronously.
>  We therefore add a JobCleaned status so that users can run hooks after all of a
> job's tasks have been cleaned. The JobCleaned status is derived from the
> TaskSetManagers, each of which corresponds to a stage; once all stages of the job
> have been cleaned, the job is cleaned (see the listener sketch below).
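
A minimal sketch of the kind of hook this targets, under the assumption that the event proposed here exists. Only SparkListener, SparkListenerJobEnd and SparkContext.addSparkListener are existing Spark API; SparkListenerJobCleaned, onJobCleaned, TempPathCleanupListener and deleteRecursively are illustrative names for what the linked PR proposes and are not part of released Spark.

{code:scala}
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd}

// Illustrative listener that deletes a job's external temporary path once it
// is safe to do so; the jobId -> path mapping is supplied by the application.
class TempPathCleanupListener(tempPaths: Map[Int, String]) extends SparkListener {

  // Existing callback: fires as soon as the job's result is ready. Zombie or
  // abnormal tasks of the job may still be running at this point, so deleting
  // the temporary path here could race with a task that is still writing to it.
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = ()

  // Proposed callback from this ticket (name assumed from the PR, not yet part
  // of SparkListener): it would fire only after every TaskSetManager of the
  // job's stages has been cleaned, i.e. no task of the job can still touch the
  // temporary path.
  // override def onJobCleaned(event: SparkListenerJobCleaned): Unit =
  //   tempPaths.get(event.jobId).foreach(deleteRecursively)

  // Placeholder for the actual cleanup, e.g. a recursive delete through the
  // Hadoop FileSystem API.
  private def deleteRecursively(path: String): Unit = ()
}

// Registration uses the existing public API:
//   sc.addSparkListener(new TempPathCleanupListener(tempPathsByJobId))
{code}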



