[
https://issues.apache.org/jira/browse/SPARK-7699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14550357#comment-14550357
]
Saisai Shao commented on SPARK-7699:
------------------------------------
I think if we have lots of pending tasks, the actual number of requested executors will
be larger than the minimum number of executors at start time.
IIUC, the problem here is that no matter what we set for {{initialExecutors}}, this
configuration will not take effect; the requested executor number will be either
{{maxNeeded}} or {{minNumExecutors}}.
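
To illustrate, here is a minimal sketch of the behavior being described (not the actual Spark source; the object name {{AllocationSketch}}, the method names, and the task counts are hypothetical): once the target is recomputed from pending tasks, it is clamped only by {{minNumExecutors}} and {{maxNumExecutors}}, so whatever {{initialExecutors}} was set to is immediately overwritten.

{code:scala}
// Hypothetical sketch of the allocation logic described above (not the actual
// Spark implementation). The target executor count is derived from pending and
// running tasks and bounded by min/max only, so initialExecutors has no effect
// after the first update.
object AllocationSketch {
  val minNumExecutors = 2
  val maxNumExecutors = 4
  val initialExecutors = 3 // set, but never consulted again in this sketch

  // Executors needed to run all pending + running tasks (rounded up).
  def maxNumExecutorsNeeded(pendingTasks: Int, runningTasks: Int,
                            tasksPerExecutor: Int): Int =
    (pendingTasks + runningTasks + tasksPerExecutor - 1) / tasksPerExecutor

  // On each update the target becomes either maxNeeded or minNumExecutors,
  // regardless of initialExecutors.
  def updatedTarget(pendingTasks: Int, runningTasks: Int,
                    tasksPerExecutor: Int): Int = {
    val maxNeeded = maxNumExecutorsNeeded(pendingTasks, runningTasks, tasksPerExecutor)
    math.min(maxNumExecutors, math.max(minNumExecutors, maxNeeded))
  }

  def main(args: Array[String]): Unit = {
    // An idle spark-shell has no pending tasks, so the target drops to
    // minNumExecutors = 2, matching the reported initial executor number of 2
    // instead of the configured 3.
    println(updatedTarget(pendingTasks = 0, runningTasks = 0, tasksPerExecutor = 1))
  }
}
{code}

In this sketch, with no tasks pending the target falls back to {{minNumExecutors}} = 2, which is consistent with the behavior reported below.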
> Config "spark.dynamicAllocation.initialExecutors" has no effect
> ----------------------------------------------------------------
>
> Key: SPARK-7699
> URL: https://issues.apache.org/jira/browse/SPARK-7699
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: meiyoula
>
> spark.dynamicAllocation.minExecutors 2
> spark.dynamicAllocation.initialExecutors 3
> spark.dynamicAllocation.maxExecutors 4
> Just run the spark-shell with the above configurations; the initial executor
> number is 2.