meiyoula created SPARK-11334:
--------------------------------

             Summary: numRunningTasks can't be less than 0, or it will affect 
executor allocation
                 Key: SPARK-11334
                 URL: https://issues.apache.org/jira/browse/SPARK-11334
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
            Reporter: meiyoula


With the Dynamic Allocation feature, when a task fails more than maxFailures 
times, all dependent jobs, stages, and tasks are killed or aborted. In this 
process, the SparkListenerTaskEnd event can arrive after 
SparkListenerStageCompleted and SparkListenerJobEnd, as in the event log below:
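The effect of this event ordering can be modeled with a minimal Python sketch. The class and method names below are illustrative only, not Spark's actual ExecutorAllocationManager API; it assumes a listener that counts running tasks and derives an executor target from that count:

```python
class AllocationListener:
    """Toy model of a dynamic-allocation listener (not Spark's real API)."""

    def __init__(self):
        self.num_running_tasks = 0

    def on_task_start(self):
        self.num_running_tasks += 1

    def on_stage_completed(self):
        # Stage is done (e.g. aborted after maxFailures): its tasks
        # no longer count toward the running total.
        self.num_running_tasks = 0

    def on_task_end(self):
        # If the stage already completed, this late event drives the
        # counter below 0 -- the bug described in this issue.
        self.num_running_tasks -= 1

    def max_executors_needed(self, tasks_per_executor=1):
        # A negative task count yields a nonsensical executor target.
        return self.num_running_tasks // tasks_per_executor


# Events arriving in the problematic order:
listener = AllocationListener()
listener.on_task_start()
listener.on_stage_completed()  # stage aborted first
listener.on_task_end()         # TaskEnd arrives afterwards
```

After this sequence `num_running_tasks` is -1, so any executor target derived from it is skewed; clamping the counter at 0 in `on_task_end` would avoid the underflow.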



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
