[
https://issues.apache.org/jira/browse/SPARK-9103?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Zhang, Liye updated SPARK-9103:
-------------------------------
Attachment: (was: Tracking Spark Memory Usage - Phase 1.pdf)
> Tracking spark's memory usage
> -----------------------------
>
> Key: SPARK-9103
> URL: https://issues.apache.org/jira/browse/SPARK-9103
> Project: Spark
> Issue Type: Umbrella
> Components: Spark Core, Web UI
> Reporter: Zhang, Liye
>
> Currently Spark exposes very little memory usage information for the
> executors (only the RDD cache size on the web UI). Users have no idea where
> memory is being consumed when they run Spark applications that use a lot of
> executor memory, and when they hit an OOM it is very hard to identify the
> cause of the problem. It would therefore be helpful to report detailed memory
> consumption for each part of Spark, so that users get a clear picture of
> exactly where memory is used.
> The memory usage information to expose should include, but not be limited
> to, shuffle, cache, network, and serializer memory.
> Users can optionally enable this functionality, since it is mainly intended
> for debugging and tuning.
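> For context, a minimal sketch of what is observable today (illustrative only,
> using a local SparkContext): SparkContext.getExecutorMemoryStatus reports only
> block-manager storage (cache) memory per executor, so shuffle, network, and
> serializer usage cannot be seen at all, which is the gap this issue describes.
> {code:scala}
> import org.apache.spark.{SparkConf, SparkContext}
>
> object ExecutorMemorySnapshot {
>   def main(args: Array[String]): Unit = {
>     // Local context purely for illustration; any running application behaves the same.
>     val conf = new SparkConf().setAppName("memory-snapshot").setMaster("local[2]")
>     val sc = new SparkContext(conf)
>
>     // The only programmatic memory view today: per block manager,
>     // (max memory available for caching, remaining memory for caching).
>     sc.getExecutorMemoryStatus.foreach { case (blockManagerId, (maxMem, remainingMem)) =>
>       println(f"$blockManagerId%-25s cache max: ${maxMem / (1 << 20)}%d MB, " +
>         f"free: ${remainingMem / (1 << 20)}%d MB")
>     }
>     // Shuffle, network, and serializer memory are not reported anywhere,
>     // which is what this umbrella issue proposes to change.
>
>     sc.stop()
>   }
> }
> {code}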
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)