Ryan Blue created SPARK-13688:
---------------------------------

             Summary: Add option to use dynamic allocation even if spark.executor.instances is set.
                 Key: SPARK-13688
                 URL: https://issues.apache.org/jira/browse/SPARK-13688
             Project: Spark
          Issue Type: Bug
            Reporter: Ryan Blue


When both spark.dynamicAllocation.enabled and spark.executor.instances are set, 
dynamic resource allocation is disabled (see SPARK-9092). This is a reasonable 
default, but I think there should be a configuration property to override it 
because it isn't obvious to users that dynamic allocation and number of 
executors are mutually exclusive. We see users setting --num-executors because 
that looks like what they want: a way to get more executors.
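For example, a submission like the following currently disables dynamic allocation, even though spark.dynamicAllocation.enabled is set (the application class and jar are illustrative):

```shell
# --num-executors sets spark.executor.instances, which (per SPARK-9092)
# turns dynamic allocation off despite the explicit enabled=true flag.
spark-submit \
  --conf spark.dynamicAllocation.enabled=true \
  --num-executors 10 \
  --class com.example.MyApp myapp.jar
```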

I propose adding a new boolean property, 
spark.dynamicAllocation.overrideNumExecutors, that makes dynamic allocation the 
default when both are set and uses --num-executors as the minimum number of 
executors.
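Under this proposal, a configuration like the sketch below would keep dynamic allocation on and treat the instance count as a floor (note that spark.dynamicAllocation.overrideNumExecutors is the property proposed here, not an existing Spark setting):

```
# spark-defaults.conf (hypothetical, assuming the proposed property)
spark.dynamicAllocation.enabled                true
spark.dynamicAllocation.overrideNumExecutors   true
# With the override set, this becomes the minimum executor count
# rather than disabling dynamic allocation:
spark.executor.instances                       10
```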



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
