[ 
https://issues.apache.org/jira/browse/SPARK-8974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14623115#comment-14623115
 ] 

KaiXinXIaoLei commented on SPARK-8974:
--------------------------------------

I tested this case. When the ApplicationMaster is dead or disconnected and 
tasks are submitted, executors are requested to register, but the new 
ApplicationMaster has not started yet. So the spark-dynamic-executor-allocation 
thread throws an exception and retries three times (the default). By the time 
the new ApplicationMaster starts, the spark-dynamic-executor-allocation thread 
has already died and is not recovered, so executor allocation no longer works. 
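The failure mode above can be sketched with plain JDK scheduling, which is what 
the allocation thread is built on: a task submitted via `scheduleAtFixedRate` 
whose run throws an uncaught exception has all subsequent executions suppressed, 
so the thread is effectively dead and never recovers. This is a minimal sketch 
of that mechanism only; the object name and exception message are illustrative, 
not Spark's actual code:

```scala
import java.util.concurrent.{Executors, TimeUnit}
import java.util.concurrent.atomic.AtomicInteger

object AllocationThreadDemo {
  // Returns how many times the scheduled task actually ran.
  def runsBeforeDeath(): Int = {
    val runs = new AtomicInteger(0)
    val pool = Executors.newSingleThreadScheduledExecutor()
    // Schedule a recurring task, like spark-dynamic-executor-allocation does.
    pool.scheduleAtFixedRate(new Runnable {
      def run(): Unit = {
        runs.incrementAndGet()
        // Simulate the allocation request failing while no AM is registered.
        throw new IllegalStateException("ApplicationMaster not registered")
      }
    }, 0, 10, TimeUnit.MILLISECONDS)
    Thread.sleep(200) // plenty of time for further 10 ms-period runs
    pool.shutdownNow()
    runs.get()
  }

  def main(args: Array[String]): Unit =
    // The uncaught exception suppresses all later executions, so the
    // task runs exactly once and the "thread" is dead from then on.
    println(s"task ran ${runsBeforeDeath()} time(s) before dying")
}
```

Because the exception escapes the task, the executor cancels the periodic 
schedule; restarting the ApplicationMaster later does not revive it, which 
matches the behavior reported here.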

> The spark-dynamic-executor-allocation may not be supported
> ----------------------------------------------------------
>
>                 Key: SPARK-8974
>                 URL: https://issues.apache.org/jira/browse/SPARK-8974
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: KaiXinXIaoLei
>             Fix For: 1.5.0
>
>
> In yarn-client mode, with the config option "spark.dynamicAllocation.enabled" 
> set to true, when the ApplicationMaster is dead or disconnected and tasks are 
> submitted before the new ApplicationMaster starts, the thread of 
> spark-dynamic-executor-allocation throws an exception and dies. Afterwards, 
> even when the ApplicationMaster is running and no tasks are running, the 
> number of executors does not drop to zero. So the dynamicAllocation feature 
> is no longer supported.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
