[
https://issues.apache.org/jira/browse/SPARK-14261?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15275238#comment-15275238
]
Weizhong edited comment on SPARK-14261 at 5/7/16 1:32 PM:
----------------------------------------------------------
I also hit this issue.
I found that every session adds one metastore connection, and when I dumped
the heap, I found many Hive and HiveConf objects that were never removed.
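A hedged sketch of how that heap-dump observation can be checked from the command line. The histogram lines below are sample data in an assumed `jmap -histo` layout, used only to demonstrate the filter; on a live server you would pipe `jmap -histo:live <thrift-server-pid>` into the same grep. The class names `org.apache.hadoop.hive.conf.HiveConf` and `org.apache.hadoop.hive.ql.metadata.Hive` are taken from the comment above.

```shell
# Sample class-histogram output (format assumed from a typical jmap -histo run).
# A per-session growth in the #instances column for these two classes would
# suggest sessions are not releasing their metastore state.
cat <<'EOF' > /tmp/histo.txt
 num     #instances         #bytes  class name
   1:          4215       13451520  org.apache.hadoop.hive.conf.HiveConf
   2:          4215         472080  org.apache.hadoop.hive.ql.metadata.Hive
   3:           120          11520  java.lang.String
EOF

# Count histogram rows for the suspected leaked classes.
grep -cE 'hive\.(conf\.HiveConf|ql\.metadata\.Hive)$' /tmp/histo.txt
```

On a live Thrift server, comparing two such histograms taken a few hundred sessions apart would show whether the instance counts keep climbing.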
> Memory leak in Spark Thrift Server
> ----------------------------------
>
> Key: SPARK-14261
> URL: https://issues.apache.org/jira/browse/SPARK-14261
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.0
> Reporter: Xiaochun Liang
> Attachments: 16716_heapdump_64g.PNG, 16716_heapdump_80g.PNG,
> MemorySnapshot.PNG
>
>
> I am running the Spark Thrift server on Windows Server 2012, launched in
> YARN client mode. Its memory usage increases gradually as queries come in.
> I suspect there is a memory leak in the Spark Thrift server.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]