[ 
https://issues.apache.org/jira/browse/LOG4J2-505?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13909163#comment-13909163
 ] 

Tal Liron commented on LOG4J2-505:
----------------------------------

It does, thanks! It seems that Disruptor is indeed a stable solution.
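
For reference, a minimal sketch of the kind of setup under discussion: with the LMAX disruptor jar on the classpath and the Log4jContextSelector system property pointing at AsyncLoggerContextSelector, every logger becomes asynchronous and is backed by the Disruptor. This is just one way to enable the async machinery, not necessarily the exact configuration from the report; the property and class names are from the Log4j 2 documentation, while the example class itself is hypothetical.

    // Run with the LMAX disruptor jar on the classpath and:
    //   -DLog4jContextSelector=org.apache.logging.log4j.core.async.AsyncLoggerContextSelector
    // so that every Logger obtained below is asynchronous (Disruptor-backed).
    import org.apache.logging.log4j.LogManager;
    import org.apache.logging.log4j.Logger;

    public class AsyncLoggingExample {
        private static final Logger LOG = LogManager.getLogger(AsyncLoggingExample.class);

        public static void main(String[] args) {
            for (int i = 0; i < 1000000; i++) {
                // Each call only hands the event off to the background ring buffer;
                // the appenders run on the Disruptor's consumer thread.
                LOG.info("event {}", i);
            }
        }
    }

As far as I can tell, the AsyncLoggerConfigHelper class named in this issue belongs to the alternative route, where individual loggers are marked with <AsyncLogger> in log4j2.xml rather than made asynchronous globally.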

> Memory leak with 
> org.apache.logging.log4j.core.async.AsyncLoggerConfigHelper$Log4jEventWrapper
> ----------------------------------------------------------------------------------------------
>
>                 Key: LOG4J2-505
>                 URL: https://issues.apache.org/jira/browse/LOG4J2-505
>             Project: Log4j 2
>          Issue Type: Bug
>          Components: Core
>    Affects Versions: 2.0-beta9
>            Reporter: Tal Liron
>            Assignee: Remko Popma
>             Fix For: 2.0-beta9
>
>
> Instances of this class seem to be created but never garbage collected. Here 
> is a jmap dump of the problem:
> https://dl.dropboxusercontent.com/u/122806/jvm8_gc2.zip
> Use jhat to analyze it: if you go to the instance count, you will see that 
> the aforementioned class is way out of control.
> Some background on how I discovered this, which may help: I am currently 
> working with the Oracle OpenJDK team to debug a memory leak in JSR-292 
> (invokedynamic) that has been present since 7u40 and still plagues OpenJDK 8 
> right now. The bug is prevalent in the Nashorn engine, which ships with 
> JDK 8. Indeed, in the memory dump above you'll see that JSR-292 and Nashorn 
> classes are also out of control, but still second to the log4j class!
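
For anyone who wants to reproduce the analysis against their own application: the dump above was taken with jmap, but an equivalent live-objects dump can also be captured from inside the JVM. A minimal sketch, assuming a HotSpot JDK 7+ runtime (the HeapDump class and the heap.hprof file name are just placeholders); the resulting file can then be browsed with jhat as described in the report.

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    public class HeapDump {
        public static void main(String[] args) throws Exception {
            // Look up the HotSpot-specific diagnostic MXBean (available on JDK 7+).
            HotSpotDiagnosticMXBean diag =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            // true = dump only live (reachable) objects, like jmap -dump:live.
            diag.dumpHeap("heap.hprof", true);
        }
    }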


