[ https://issues.apache.org/jira/browse/SPARK-13183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

dylanzhou updated SPARK-13183:
------------------------------
    Description: 
When I use Spark Streaming together with Spark SQL, the old generation grows 
very quickly and full GCs become very frequent; after running for a while the 
application goes out of memory. Analyzing a heap dump, I found a large number 
of org.apache.spark.sql.columnar.ColumnBuilder[38] @ 0xd022a0b8 objects, which 
take up 90% of the heap. Looking at the source, the space is held by 
HeapByteBuffer instances. I don't know why these objects are not released; 
they just sit there waiting for the GC to reclaim them.
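
The report does not include the job code, so the following is only a minimal 
sketch (Spark 1.4 API) of the kind of Spark Streaming + Spark SQL pattern that 
could show this symptom, assuming the columnar buffers come from the in-memory 
cache (sqlContext.cacheTable is what allocates ColumnBuilder/HeapByteBuffer) 
being filled every batch and never released. The table name, stream source, 
and query below are hypothetical, not taken from the original report.

{code:scala}
// Hypothetical reproduction sketch, not the reporter's actual job.
// Each batch re-registers and re-caches a temp table without ever calling
// uncacheTable, so the cached columnar buffers (HeapByteBuffer inside
// ColumnBuilder) stay referenced and old gen keeps growing.
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

object ColumnarCacheLeakSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("columnar-cache-leak-sketch")
    val ssc  = new StreamingContext(conf, Seconds(10))
    val sqlContext = new SQLContext(ssc.sparkContext)
    import sqlContext.implicits._

    val lines = ssc.socketTextStream("localhost", 9999)

    lines.foreachRDD { rdd =>
      val df = rdd.map(line => (line, line.length)).toDF("value", "len")
      df.registerTempTable("events")
      // Fills the in-memory columnar cache for this batch's table.
      sqlContext.cacheTable("events")
      sqlContext.sql("SELECT len, count(*) FROM events GROUP BY len").collect()
      // Missing cleanup: sqlContext.uncacheTable("events")
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
{code}

If the job looks like this, calling sqlContext.uncacheTable("events") at the 
end of each batch should let the old columnar buffers be collected; if they 
still accumulate after that, it would point at a genuine leak in the columnar 
cache rather than a usage issue.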


  was:
When I used Spark Streaming together with Spark SQL, the old generation grew 
very quickly and full GCs were very frequent; after running for a while the 
application went out of memory. Analyzing a heap dump, I found a large number 
of org.apache.spark.sql.columnar.ColumnBuilder[38] @ 0xd022a0b8 objects, which 
take up 90% of the heap. Looking at the source, the space is held by 
HeapByteBuffer instances. I don't know why these objects are not released; 
they just sit there waiting for the GC to reclaim them.



> Bytebuffers occupy a large amount of heap memory
> ------------------------------------------------
>
>                 Key: SPARK-13183
>                 URL: https://issues.apache.org/jira/browse/SPARK-13183
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.4.1
>            Reporter: dylanzhou
>
> When I use Spark Streaming together with Spark SQL, the old generation grows 
> very quickly and full GCs become very frequent; after running for a while the 
> application goes out of memory. Analyzing a heap dump, I found a large number 
> of org.apache.spark.sql.columnar.ColumnBuilder[38] @ 0xd022a0b8 objects, which 
> take up 90% of the heap. Looking at the source, the space is held by 
> HeapByteBuffer instances. I don't know why these objects are not released; 
> they just sit there waiting for the GC to reclaim them.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
