Xuefu Zhang created HIVE-15543:
----------------------------------

             Summary: Don't try to get memory/cores to decide parallelism when 
Spark dynamic allocation is enabled
                 Key: HIVE-15543
                 URL: https://issues.apache.org/jira/browse/HIVE-15543
             Project: Hive
          Issue Type: Improvement
          Components: Spark
    Affects Versions: 2.2.0
            Reporter: Xuefu Zhang
            Assignee: Xuefu Zhang


Presently Hive tries to get numbers for memory and cores from the Spark 
application and uses them to determine RS parallelism. However, this doesn't 
make sense when Spark dynamic allocation is enabled, because the current numbers 
don't represent the available computing resources, especially when SparkContext 
is initially launched.

Thus, it makes sense not to do that when dynamic allocation is enabled.
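The proposed guard can be sketched as follows. This is a minimal illustration only; the class and method names are hypothetical and do not correspond to Hive's actual parallelism-estimation code:

```java
// Hypothetical sketch of the proposed behavior (illustrative names,
// not Hive's actual classes or methods).
public class ReducerParallelismEstimator {

    static final int DEFAULT_PARALLELISM = 1;
    static final long BYTES_PER_REDUCER = 256L << 20; // assumed 256 MB default

    /**
     * Decide reduce-side (RS) parallelism. When dynamic allocation is
     * enabled, the currently reported memory/core counts are unreliable
     * (executors come and go, and few may exist right after SparkContext
     * launch), so skip the resource-based estimate entirely.
     */
    public static int estimateParallelism(boolean dynamicAllocationEnabled,
                                          int availableCores,
                                          long memoryPerCore,
                                          long estimatedInputSize) {
        if (dynamicAllocationEnabled) {
            // Don't consult memory/cores; fall back to a size-based estimate.
            int bySize = (int) (estimatedInputSize / BYTES_PER_REDUCER);
            return Math.max(DEFAULT_PARALLELISM, bySize);
        }
        // Otherwise use the cluster's reported resources as before.
        int byResources = availableCores;
        int bySize = (int) (estimatedInputSize / Math.max(1L, memoryPerCore));
        return Math.max(DEFAULT_PARALLELISM, Math.min(byResources, bySize));
    }
}
```

The key point is only the early return: with dynamic allocation on, the estimate never touches the (possibly near-zero) executor counts observed at startup.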



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
