[ https://issues.apache.org/jira/browse/SPARK-16083?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15341840#comment-15341840 ]

Sean Owen commented on SPARK-16083:
-----------------------------------

OK, that does look like the JVM is getting the option, though it looks like 
you are not using the Spark scripts to run this.
Yes, I looked at the heap dump and log. VisualVM says this is a 55MB heap dump. 
It is possible that off-heap memory is somehow leaking.
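If the heap dump is only 55MB but the process is much larger, one way to confirm off-heap growth is the JDK's Native Memory Tracking. This is a hedged diagnostic sketch, not from the original thread: it assumes a JDK 8+ `jcmd`/`jps` on the PATH and that the HistoryServer JVM was restarted with NMT enabled.

```shell
# Assumption: the daemon was started with -XX:NativeMemoryTracking=summary
# added to its JVM options; NMT only reports when enabled at startup.

# Find the HistoryServer PID (hypothetical process name match).
PID=$(jps | awk '/HistoryServer/ {print $1}')

# Compare committed native memory (heap vs. everything else) over time;
# a growing non-heap total points at an off-heap leak.
jcmd "$PID" VM.native_memory summary
```

Taking this summary at intervals (or `VM.native_memory baseline` followed later by `summary.diff`) shows which category is growing.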

Can you try a more recent version of Spark? 1.5 is somewhat old now, and I 
wouldn't be surprised if something has been fixed since then. Usually we report 
JIRAs against master whenever possible.
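For comparison, the bundled scripts are the usual way to launch this daemon. A hedged sketch, assuming a standard Spark install at `$SPARK_HOME`; `SPARK_DAEMON_MEMORY` is the variable the sbin scripts read for daemon heap size:

```shell
# Start the HistoryServer via the bundled script so it picks up the
# standard daemon settings, rather than invoking the class directly.
export SPARK_DAEMON_MEMORY=1g   # heap for Spark daemons, incl. the HistoryServer
"$SPARK_HOME/sbin/start-history-server.sh"
```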

> spark HistoryServer memory increases until gets killed by OS.
> -------------------------------------------------------------
>
>                 Key: SPARK-16083
>                 URL: https://issues.apache.org/jira/browse/SPARK-16083
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.5.1
>         Environment: RHEL-6, IOP-4.1.0.0, 10 node cluster
>            Reporter: Sudhakar Thota
>         Attachments: 27814.004.000.sparkhistoryservermemory.log.1, 
> 27814004000_spark_heap_2016-06-17_12-05.hprof.gz, 
> spark-spark-org.apache.spark.deploy.history.HistoryServer-1-testbic1on5l.out.4
>
>
> The Spark HistoryServer process consumes more and more memory over a few 
> days until it is finally killed by the operating system. Heap-dump analysis 
> of jmap dumps with IBM HeapAnalyzer found that the total heap size is 800M 
> while the process size is 11G. The HistoryServer is started with 1G 
> ("history-server -Xms1g -Xmx1g org.apache.spark.deploy.history.HistoryServer").



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
