[ 
https://issues.apache.org/jira/browse/SPARK-29055?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

George Papa updated SPARK-29055:
--------------------------------
    Attachment: image-2019-09-11-16-13-32-650.png

> Memory leak in Spark Driver
> ---------------------------
>
>                 Key: SPARK-29055
>                 URL: https://issues.apache.org/jira/browse/SPARK-29055
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager, Spark Core
>    Affects Versions: 2.3.3, 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4
>            Reporter: George Papa
>            Priority: Major
>         Attachments: image-2019-09-11-16-13-20-588.png
>
>
> In Spark 2.3.3+ the driver memory increases continuously; I don't see this 
> issue with Spark 2.1.1.
> In Spark 2.1.1 the ContextCleaner runs and cleans up the driver, and the 
> BlockManager removes the broadcast blocks from memory, as you can see in 
> the following screenshot:
> !image-2019-09-11-16-13-20-588.png|width=685,height=89!
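> A minimal sketch of the kind of workload that exercises this cleanup path 
> (a hypothetical broadcast-churn loop, not the reporter's actual 
> application; the broadcast size and sleep interval are arbitrary):
> {code:scala}
> import org.apache.spark.sql.SparkSession
>
> object BroadcastChurn {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder().appName("broadcast-churn").getOrCreate()
>     val sc = spark.sparkContext
>     while (true) {
>       // Each iteration creates a fresh ~4 MB broadcast and uses it in a job.
>       val b = sc.broadcast(Array.fill(1000000)(0))
>       sc.parallelize(1 to 10).map(_ + b.value.length).count()
>       // `b` goes out of scope here; once its weak reference is collected,
>       // the ContextCleaner should ask the BlockManager to drop the broadcast
>       // blocks from driver storage (the behavior visible in the 2.1.1
>       // screenshot above).
>       Thread.sleep(1000)
>     }
>   }
> }
> {code}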
> But in Spark 2.3.3+ I don't see this cleanup, and the driver storage 
> memory keeps increasing.
> *NOTE:* After a few hours of use, the application is interrupted with the 
> following error:
> {color:#FF0000}java.lang.OutOfMemoryError: GC overhead limit exceeded{color}
>  



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
