[
https://issues.apache.org/jira/browse/SPARK-25380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16616675#comment-16616675
]
Nir Hedvat edited comment on SPARK-25380 at 9/16/18 11:21 AM:
--------------------------------------------------------------
Experiencing the same problem
[attached screenshot: image-2018-09-16-14-21-38-939.png]
was (Author: nir hedvat):
Same problem here (using Spark 2.3.1)
> Generated plans occupy over 50% of Spark driver memory
> ------------------------------------------------------
>
> Key: SPARK-25380
> URL: https://issues.apache.org/jira/browse/SPARK-25380
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.3.1
> Environment: Spark 2.3.1 (AWS emr-5.16.0)
>
> Reporter: Michael Spector
> Priority: Minor
> Attachments: Screen Shot 2018-09-06 at 23.19.56.png, Screen Shot 2018-09-12 at 8.20.05.png, heapdump_OOM.png, image-2018-09-16-14-21-38-939.png
>
>
> When debugging an OOM exception during a long run of a Spark application (many
> iterations of the same code), I've found that generated plans occupy most of
> the driver memory. I'm not sure whether this is a memory leak, but it would be
> helpful if old plans could be purged from memory anyway. (A sketch of this
> kind of workload follows the quoted issue below.)
> Attached are screenshots of the OOM heap dump opened in JVisualVM.
>