[ https://issues.apache.org/jira/browse/SPARK-8366?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

meiyoula updated SPARK-8366:
----------------------------
    Description: 
I use the *dynamic executor allocation* feature. 
When an executor is killed, all running tasks on it fail. Until maxTaskFailures 
is reached, each failed task is re-run with a new task id. 
But the `ExecutorAllocationManager` does not count these re-run tasks as 
pending, because the total number of tasks in a stage is set only when the 
stage is submitted.

  was:I use the *dynamic executor allocation* feature. When an executor is 
killed, all running tasks on it fail. When the new tasks are appended, no new 
executor is added.


> When a task fails and a new one is appended, the ExecutorAllocationManager 
> can't sense the new tasks
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-8366
>                 URL: https://issues.apache.org/jira/browse/SPARK-8366
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: meiyoula
>
> I use the *dynamic executor allocation* feature. 
> When an executor is killed, all running tasks on it fail. Until 
> maxTaskFailures is reached, each failed task is re-run with a new task id. 
> But the `ExecutorAllocationManager` does not count these re-run tasks as 
> pending, because the total number of tasks in a stage is set only when the 
> stage is submitted (a minimal sketch of this bookkeeping follows below).
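
For context, here is a minimal, self-contained sketch of the pending-task bookkeeping described above. It is plain Scala with no Spark dependency; the field names `stageIdToNumTasks` and `stageIdToTaskIndices` mirror the ones in `ExecutorAllocationManager`'s listener, but the code itself is an illustration of the reported behavior, not the actual implementation.

```scala
import scala.collection.mutable

// Hypothetical stand-in for ExecutorAllocationManager's task accounting.
object PendingTaskAccounting {
  private val stageIdToNumTasks = mutable.HashMap.empty[Int, Int]
  private val stageIdToTaskIndices = mutable.HashMap.empty[Int, mutable.HashSet[Int]]

  // The total task count for a stage is fixed once, at stage submission.
  def onStageSubmitted(stageId: Int, numTasks: Int): Unit =
    stageIdToNumTasks(stageId) = numTasks

  // Started tasks are tracked by task *index*, so a re-run of a failed task
  // (same index, new task id) does not change the set at all.
  def onTaskStart(stageId: Int, taskIndex: Int): Unit =
    stageIdToTaskIndices.getOrElseUpdate(stageId, mutable.HashSet.empty[Int]) += taskIndex

  // Pending tasks = declared total minus distinct indices ever started.
  def totalPendingTasks(): Int =
    stageIdToNumTasks.map { case (stageId, numTasks) =>
      numTasks - stageIdToTaskIndices.get(stageId).map(_.size).getOrElse(0)
    }.sum

  def main(args: Array[String]): Unit = {
    onStageSubmitted(stageId = 0, numTasks = 2)
    onTaskStart(stageId = 0, taskIndex = 0)
    onTaskStart(stageId = 0, taskIndex = 1)
    println(totalPendingTasks()) // 0 -- every task has started

    // The executor running task index 1 is killed; the scheduler re-runs the
    // task with a new task id but the same index. The failed attempt is never
    // subtracted, so the re-run is invisible as a pending task:
    onTaskStart(stageId = 0, taskIndex = 1)
    println(totalPendingTasks()) // still 0, so no new executor is requested
  }
}
```

One natural fix, under the same assumptions, would be to remove a task's index from `stageIdToTaskIndices` when its attempt ends in failure, so the resubmitted attempt counts as pending again.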



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
