Darcy Shen created SPARK-23974:

             Summary: Spark does not allocate more containers as expected with dynamic allocation
                 Key: SPARK-23974
                 URL: https://issues.apache.org/jira/browse/SPARK-23974
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.1
            Reporter: Darcy Shen

Using YARN with dynamic allocation enabled, Spark does not allocate more 
containers even when the current number of containers (executors) is below the 
configured maximum number of executors.

For example, we have only 7 executors working while the cluster is not busy, 
and I have set

{{spark.dynamicAllocation.maxExecutors = 600}}

and the current jobs of the context run slowly.
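
For reference, a minimal sketch of how dynamic allocation is configured in this setup. Only {{spark.dynamicAllocation.maxExecutors = 600}} comes from this report; the application name, the other settings, and the sample job are illustrative assumptions.

{code:scala}
import org.apache.spark.sql.SparkSession

// Minimal reproduction sketch; the master ("yarn") is normally supplied via
// spark-submit --master yarn. Only maxExecutors = 600 is taken from this issue;
// the rest of the configuration is assumed for illustration.
object DynamicAllocationRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("dynamic-allocation-repro")                    // hypothetical app name
      .config("spark.dynamicAllocation.enabled", "true")      // turn on dynamic allocation
      .config("spark.shuffle.service.enabled", "true")        // external shuffle service, required on YARN
      .config("spark.dynamicAllocation.maxExecutors", "600")  // value from this issue
      .getOrCreate()

    // With idle cluster capacity and pending tasks, the executor count is
    // expected to grow toward the maximum instead of staying at 7.
    spark.range(0L, 1000000000L).selectExpr("sum(id)").show()

    spark.stop()
  }
}
{code}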
