[
https://issues.apache.org/jira/browse/SPARK-8366?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
meiyoula updated SPARK-8366:
----------------------------
Description: I use the *dynamic executor allocation* function. When one
executor is killed, all running tasks on it fail. When the failed tasks are
resubmitted as new tasks, no new executor is added. (was: I use the *dynamic
executor allocation* function. Then one executor is killed, all the tasks on
it are failed. When the new tasks are appended, the new executor won't added.)
> When a task fails and a new one is appended, the ExecutorAllocationManager
> can't sense the new tasks
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-8366
> URL: https://issues.apache.org/jira/browse/SPARK-8366
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.4.0
> Reporter: meiyoula
>
> I use the *dynamic executor allocation* function. When one executor is
> killed, all running tasks on it fail. When the failed tasks are resubmitted
> as new tasks, no new executor is added.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]