[ https://issues.apache.org/jira/browse/SPARK-7699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14550249#comment-14550249 ]
Sean Owen commented on SPARK-7699:
----------------------------------
Isn't this the normal state of a Spark app run with spark-submit? spark-shell
is the more unusual case, where a Spark app sits there submitting no work at
the outset. updateAndSyncNumExecutorsTarget is called regularly, right? If
there's no load, I expect the number of executors to quickly drop to the minimum.
How long are you expecting the initial setting to override this logic?
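For context, here is a minimal, illustrative sketch of how a periodic target sync along the lines of updateAndSyncNumExecutorsTarget can collapse the target back to the configured minimum once nothing is pending. This is not the actual ExecutorAllocationManager code; the class, method, and parameter names below are assumptions made purely for illustration.

// Simplified, illustrative sketch of a periodic executor-target sync.
// Names here are assumptions, not Spark's ExecutorAllocationManager internals.
class AllocationSketch(minExecutors: Int, initialExecutors: Int, maxExecutors: Int) {
  // The target starts at the configured initial value...
  private var numExecutorsTarget: Int = initialExecutors

  // ...but a regular sync, analogous to updateAndSyncNumExecutorsTarget,
  // recomputes it from the current load.
  def updateTarget(pendingTasks: Int, tasksPerExecutor: Int): Int = {
    // Executors needed to run everything currently queued.
    val maxNeeded = math.ceil(pendingTasks.toDouble / tasksPerExecutor).toInt
    // Clamp between min and max; with no load this is just minExecutors,
    // which is why an idle spark-shell quickly falls back to the minimum.
    numExecutorsTarget = math.max(math.min(maxNeeded, maxExecutors), minExecutors)
    numExecutorsTarget
  }
}

object AllocationSketch {
  def main(args: Array[String]): Unit = {
    val m = new AllocationSketch(minExecutors = 2, initialExecutors = 3, maxExecutors = 4)
    // No pending tasks, as in an idle spark-shell: prints 2.
    println(m.updateTarget(pendingTasks = 0, tasksPerExecutor = 4))
  }
}

Run with the reporter's values (min=2, initial=3, max=4) and zero pending tasks, this prints 2, which matches the behaviour described in the issue.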
> Config "spark.dynamicAllocation.initialExecutors" has no effect
> ----------------------------------------------------------------
>
> Key: SPARK-7699
> URL: https://issues.apache.org/jira/browse/SPARK-7699
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: meiyoula
>
> spark.dynamicAllocation.minExecutors 2
> spark.dynamicAllocation.initialExecutors 3
> spark.dynamicAllocation.maxExecutors 4
> Running spark-shell with the above configuration, the initial executor
> number is 2 instead of the configured 3.
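For reference, a hedged sketch of how the reported settings would typically be supplied programmatically. The configuration keys are standard Spark properties; the object name and app name are made up for illustration, and spark.dynamicAllocation.enabled plus spark.shuffle.service.enabled are included only because dynamic allocation normally requires them (the same keys can equally go in spark-defaults.conf or on the spark-shell command line).

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative sketch of setting the reported configuration programmatically.
object InitialExecutorsExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("dynamic-allocation-example")
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "2")
      .set("spark.dynamicAllocation.initialExecutors", "3")
      .set("spark.dynamicAllocation.maxExecutors", "4")
      // Dynamic allocation also needs the external shuffle service on YARN.
      .set("spark.shuffle.service.enabled", "true")
    val sc = new SparkContext(conf)
    // With no jobs submitted, the executor target is expected to fall back
    // toward minExecutors shortly after startup, which is the behaviour reported.
    sc.stop()
  }
}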