Michael Procopio created SPARK-10453:
----------------------------------------
Summary: There's no way to use spark.dynamicAllocation.enabled
with PySpark
Key: SPARK-10453
URL: https://issues.apache.org/jira/browse/SPARK-10453
Project: Spark
Issue Type: Bug
Components: PySpark
Affects Versions: 1.4.0
Environment: When spark.dynamicAllocation.enabled is set, the
assumption is that memory/core resources will be mediated by the YARN
resource manager. Unfortunately, whatever value is given for
spark.executor.memory is consumed entirely as JVM heap space by the
executor, so there is no way to account for the memory requirements of
the PySpark worker processes. Executor JVM heap space should be
decoupled from spark.executor.memory; a configuration sketch of a
partial workaround follows below.
Reporter: Michael Procopio
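
As a possible mitigation (not part of the original report; the property
values below are illustrative assumptions), the off-heap headroom that
YARN grants on top of the executor heap can be raised via
spark.yarn.executor.memoryOverhead, and the memory each Python worker
uses during aggregation can be capped with spark.python.worker.memory.
A minimal PySpark sketch, assuming a YARN cluster with the external
shuffle service available:

from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("dynamic-allocation-pyspark")
        # Dynamic allocation on YARN requires the external shuffle service.
        .set("spark.dynamicAllocation.enabled", "true")
        .set("spark.shuffle.service.enabled", "true")
        # As reported above, this entire amount becomes executor JVM heap.
        .set("spark.executor.memory", "4g")
        # Illustrative value: extra off-heap room in the YARN container,
        # which is where the PySpark worker processes must fit (in MiB).
        .set("spark.yarn.executor.memoryOverhead", "1536")
        # Illustrative value: cap on memory each Python worker uses during
        # aggregation before it spills to disk.
        .set("spark.python.worker.memory", "1g"))

sc = SparkContext(conf=conf)

Note that this only sets aside container headroom for the Python
workers; it does not decouple the executor JVM heap from
spark.executor.memory as the report requests.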