[
https://issues.apache.org/jira/browse/HADOOP-14907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16181120#comment-16181120
]
Thomas Graves commented on HADOOP-14907:
----------------------------------------
Can you give more details on where the heap dump is from? It looks like you
are running Spark. Are you using the --keytab option?
> Memory leak in FileSystem cache
> -------------------------------
>
> Key: HADOOP-14907
> URL: https://issues.apache.org/jira/browse/HADOOP-14907
> Project: Hadoop Common
> Issue Type: Bug
> Components: hdfs-client
> Affects Versions: 2.7.4
> Reporter: cen yuhai
> Attachments: screenshot-1.png, screenshot-2.png
>
>
> There is a memory leak in the FileSystem cache; it can consume a lot of memory.
> I think the root cause is that the equals method of the cache's Key class is not
> right. As you can see in screenshot-1.png, the same user etl appears under
> different keys... The FileSystem cache should also be an LRU cache.
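For reference, here is a minimal sketch of how the cache can grow even for a single user (the class name and loop are illustrative, not taken from the reporter's job): FileSystem.Cache.Key includes the UserGroupInformation, and UGI equality is based on Subject identity, so each newly created UGI for the same user produces a distinct cache entry.
{code:java}
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

// Illustrative only: shows how repeatedly creating UGIs for the same user
// keeps adding FileSystem instances to the cache.
public class FsCacheGrowthSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    for (int i = 0; i < 1000; i++) {
      // createRemoteUser wraps a brand-new Subject, so this UGI is never
      // equal to any previous UGI for "etl" (UGI equality is Subject identity).
      UserGroupInformation ugi = UserGroupInformation.createRemoteUser("etl");
      ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
        // Same scheme, authority and conf, but a new Cache.Key each iteration
        // because the UGI differs, so a new FileSystem instance is created,
        // cached, and never evicted.
        FileSystem fs = FileSystem.get(conf);
        return null;
      });
      // Workaround: FileSystem.closeAllForUGI(ugi) here releases the entry.
    }
  }
}
{code}
As far as I know, until the key semantics change the usual workarounds are to call fs.close() or FileSystem.closeAllForUGI(ugi) when a UGI is done with, or to set fs.<scheme>.impl.disable.cache=true (e.g. fs.hdfs.impl.disable.cache) so FileSystem.get returns uncached instances.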