[ https://issues.apache.org/jira/browse/SPARK-25380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16691519#comment-16691519 ]

Dave DeCaprio commented on SPARK-25380:
---------------------------------------

We are also seeing this. 200 MB plans are not unusual for us.
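
A minimal sketch of the pattern that seems to produce this (hypothetical; the
loop and names are illustrative, not taken from our actual job): each
iteration's transformation is appended onto the previous DataFrame's logical
plan, so the plans the driver retains grow without bound unless lineage is
truncated, e.g. with Dataset.checkpoint():

    import org.apache.spark.sql.SparkSession

    object PlanGrowth {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("plan-growth").getOrCreate()
        // Checkpointing needs a reliable directory to materialize data into.
        spark.sparkContext.setCheckpointDir("/tmp/checkpoints")
        import spark.implicits._

        var df = Seq(0L).toDF("v")
        for (i <- 1 to 1000) {
          // Each union extends the logical plan by one more node, so the
          // analyzed/optimized plans kept on the driver grow linearly.
          df = df.union(Seq(i.toLong).toDF("v"))
          // Periodic checkpointing replaces the accumulated lineage with a
          // materialized RDD, which keeps the retained plan small.
          if (i % 100 == 0) df = df.checkpoint()
        }
        println(df.count())
        spark.stop()
      }
    }

Checkpointing works around the symptom but doesn't release the plans already
cached on the driver, which is why a purge mechanism would still help.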

> Generated plans occupy over 50% of Spark driver memory
> ------------------------------------------------------
>
>                 Key: SPARK-25380
>                 URL: https://issues.apache.org/jira/browse/SPARK-25380
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>         Environment: Spark 2.3.1 (AWS emr-5.16.0)
>  
>            Reporter: Michael Spector
>            Priority: Minor
>         Attachments: Screen Shot 2018-09-06 at 23.19.56.png, Screen Shot 
> 2018-09-12 at 8.20.05.png, heapdump_OOM.png, image-2018-09-16-14-21-38-939.png
>
>
> While debugging an OOM exception during a long run of a Spark application 
> (many iterations of the same code), I found that generated plans occupy most 
> of the driver memory. I'm not sure whether this is a memory leak, but it 
> would be helpful if old plans could be purged from memory anyway.
> Attached are screenshots of the OOM heap dump opened in JVisualVM.
>  


