meiyoula created SPARK-8099:
-------------------------------
Summary: In yarn-cluster mode, "--executor-cores" can't be set
into SparkConf
Key: SPARK-8099
URL: https://issues.apache.org/jira/browse/SPARK-8099
Project: Spark
Issue Type: Bug
Components: YARN
Reporter: meiyoula
While testing the dynamic executor allocation function, I set the executor cores
with `--executor-cores 4` in the spark-submit command. But in
`ExecutorAllocationManager`, `private val tasksPerExecutor =
conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)`
still evaluates to 1.
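
A minimal sketch of the effect, assuming the quoted lookup is run against a SparkConf that never received the `--executor-cores` value (this is a standalone illustration, not the actual `ExecutorAllocationManager` code path):

```scala
import org.apache.spark.SparkConf

object TasksPerExecutorSketch {
  def main(args: Array[String]): Unit = {
    // Conf as seen on the driver in yarn-cluster mode: if "--executor-cores 4"
    // was not propagated into spark.executor.cores, getInt falls back to 1.
    val conf = new SparkConf()
    val tasksPerExecutor =
      conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)
    println(s"tasksPerExecutor = $tasksPerExecutor") // prints 1 instead of the expected 4
  }
}
```

Passing the value explicitly with `--conf spark.executor.cores=4` on spark-submit is an assumed workaround here, not something verified in this report.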