[
https://issues.apache.org/jira/browse/SPARK-26750?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Apache Spark reassigned SPARK-26750:
------------------------------------
Assignee: Apache Spark
> Memory overhead estimation should take multi-cores into account
> ---------------------------------------------------------------
>
> Key: SPARK-26750
> URL: https://issues.apache.org/jira/browse/SPARK-26750
> Project: Spark
> Issue Type: Improvement
> Components: YARN
> Affects Versions: 2.4.0
> Reporter: liupengcheng
> Assignee: Apache Spark
> Priority: Major
>
> Currently, Spark estimates the memory overhead without taking multi-cores into
> account. This can sometimes cause a direct-memory OOM, or the executor may be
> killed by YARN for exceeding the requested physical memory.
> The memory overhead is related to the executor's core count (mainly through
> Spark's direct memory and some related JVM native memory, for instance
> thread stacks, GC data, etc.), so we could improve the estimation by
> taking the core count into account.
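A minimal sketch of the proposed idea, for illustration only: Spark's current YARN estimate is max(10% of executor memory, 384 MB), independent of cores; the per-core term below (`PER_CORE_OVERHEAD_MB`, and its value) is a hypothetical assumption standing in for the thread-stack/GC/direct-memory cost per core, not Spark's implementation.

```python
# Hypothetical core-aware memory-overhead estimate for a YARN executor.
# Spark 2.4's actual estimate is max(0.10 * executorMemory, 384 MB);
# the per-core term is an illustrative assumption.

MIN_OVERHEAD_MB = 384       # Spark's floor for executor memory overhead
OVERHEAD_FRACTION = 0.10    # Spark's default 10% overhead factor
PER_CORE_OVERHEAD_MB = 64   # assumed extra native memory per executor core

def estimate_overhead_mb(executor_memory_mb: int, executor_cores: int) -> int:
    """Return a core-aware overhead estimate in MB (illustrative only)."""
    base = max(int(executor_memory_mb * OVERHEAD_FRACTION), MIN_OVERHEAD_MB)
    return base + PER_CORE_OVERHEAD_MB * executor_cores

# An 8 GB executor with 4 cores gets a larger overhead than with 1 core:
print(estimate_overhead_mb(8192, 4))  # 819 + 256 = 1075
print(estimate_overhead_mb(8192, 1))  # 819 + 64  = 883
```

Under this sketch, a many-core executor requests proportionally more overhead, which is the behavior the issue asks for.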
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]