[ 
https://issues.apache.org/jira/browse/SPARK-14261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15275238#comment-15275238
 ] 

Weizhong edited comment on SPARK-14261 at 5/10/16 6:52 AM:
-----------------------------------------------------------

I also face this issue.
I found that each session adds one HiveConf on sun.misc.Launcher$AppClassLoader 
and one on org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1. None 
of these HiveConf objects can be released, so memory grows until OOM.
From the OOM heap dump (analyzed with Eclipse Memory Analyzer), every HiveConf 
still has a reference path to a GC root, like below:
{noformat}
org.apache.hadoop.hive.conf.HiveConf
  conf org.apache.hadoop.hive.ql.session.SessionState$SessionStates
    value java.lang.ThreadLocal$ThreadLocalMap$Entry
      [19]  java.lang.ThreadLocal$ThreadLocalMap$Entry[32]
        table java.lang.ThreadLocal$ThreadLocalMap
          threadLocals java.lang.Thread
  referent java.util.WeakHashMap$Entry
  conf org.apache.hadoop.hive.ql.session.SessionState
    state org.apache.spark.sql.hive.client.ClientWrapper
      metaHive, metadataHive, metaHive org.apache.spark.sql.hive.HiveContext
        $outer org.apache.spark.sql.SQLContext$$anon$4
        [265] java.lang.Object[267]
          array java.util.concurrent.CopyOnWriteArrayList
            listeners org.apache.spark.scheduler.LiveListenerBus
              $outer org.apache.spark.util.AsynchronousListenerBus$$anon$1
{noformat}
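The reference chain above boils down to a ThreadLocal on a long-lived handler thread pinning every per-session HiveConf: the ThreadLocalMap entry stays reachable as long as the thread lives, so nothing the session allocated is ever collected. A minimal sketch of that pattern (all class, field, and method names here are hypothetical stand-ins, not Spark's actual code):

```java
import java.util.ArrayList;
import java.util.List;

public class ThreadLocalLeakSketch {
    // Hypothetical stand-in for HiveConf: a heavyweight per-session object.
    static class SessionConf {
        final byte[] payload = new byte[1 << 20]; // ~1 MB, like a populated HiveConf
    }

    // Mirrors SessionState's per-thread holder: one ThreadLocalMap entry per
    // thread, which the GC cannot clear while that thread is still alive.
    static final ThreadLocal<List<SessionConf>> SESSIONS =
            ThreadLocal.withInitial(ArrayList::new);

    // Each "session" parks another conf in the current thread's map entry.
    static void openSession() {
        SESSIONS.get().add(new SessionConf());
    }

    static int countAfterSessions(int n) {
        for (int i = 0; i < n; i++) {
            openSession();
        }
        int live = SESSIONS.get().size();
        // The fix: explicitly detach the per-thread state when the session
        // ends; otherwise a long-lived Thrift handler thread pins every conf.
        SESSIONS.remove();
        return live;
    }

    public static void main(String[] args) {
        System.out.println("live confs after 5 sessions: " + countAfterSessions(5));
    }
}
```

Without the `remove()` call, each conf survives for the lifetime of the handler thread, which matches the per-session HiveConf accumulation in the dump.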


was (Author: sephiroth-lin):
I also face this issue.
I found that every session adds one metastore connection, and when I dumped the 
heap, I found many Hive and HiveConf objects that were never removed.

> Memory leak in Spark Thrift Server
> ----------------------------------
>
>                 Key: SPARK-14261
>                 URL: https://issues.apache.org/jira/browse/SPARK-14261
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Xiaochun Liang
>         Attachments: 16716_heapdump_64g.PNG, 16716_heapdump_80g.PNG, 
> MemorySnapshot.PNG
>
>
> I am running the Spark Thrift server on Windows Server 2012, launched in Yarn 
> client mode. Its memory usage increases gradually as queries come in. I 
> suspect there is a memory leak in the Spark Thrift server.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
