Now I've got it.  One possible source of memory consumption is
CumulativeProtocolDecoder, which stores a ByteBuffer.  The author mentions
using a timeout to minimize the memory consumption.  But what happens
after the timeout?  That depends on the details of the protocol
specification we are implementing.  We could easily add code to
CumulativeProtocolDecoder that discards the buffer when the decode()
method hasn't been called for a certain amount of time.  But is this what
we really need?
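
Something along these lines would do, as a rough self-contained sketch
(the class name, the decode() signature and the append() helper are made
up for illustration; this is not MINA's actual CumulativeProtocolDecoder
API):

import java.nio.ByteBuffer;

// Rough sketch only: a cumulating decoder that remembers when decode()
// last ran and throws away its internal buffer once it has been idle
// longer than a given timeout.  All names here are hypothetical.
public class IdleDiscardingDecoder {

    private final long timeoutMillis;
    private ByteBuffer cumulative;  // not-yet-decoded fragment, may be null
    private long lastDecodeTime = System.currentTimeMillis();

    public IdleDiscardingDecoder(long timeoutMillis) {
        this.timeoutMillis = timeoutMillis;
    }

    public void decode(ByteBuffer in) {
        long now = System.currentTimeMillis();
        // If decode() hasn't been called for a while, discard the stale
        // fragment so a half-received message can't pin memory forever.
        if (cumulative != null && now - lastDecodeTime > timeoutMillis) {
            cumulative = null;
        }
        lastDecodeTime = now;
        cumulative = append(cumulative, in);
        // ... try to decode complete messages from 'cumulative' here,
        // releasing it as data is consumed ...
    }

    private static ByteBuffer append(ByteBuffer acc, ByteBuffer in) {
        int extra = in.remaining();
        if (acc == null) {
            ByteBuffer copy = ByteBuffer.allocate(extra);
            copy.put(in);
            copy.flip();
            return copy;
        }
        ByteBuffer grown = ByteBuffer.allocate(acc.remaining() + extra);
        grown.put(acc);
        grown.put(in);
        grown.flip();
        return grown;
    }
}

The tricky part is the discard policy: silently dropping a half-received
message only makes sense if the protocol can resynchronize afterwards;
otherwise closing the session on timeout is probably the safer choice.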

Trustin

On 7/3/06, Trustin Lee <[EMAIL PROTECTED]> wrote:

On 7/3/06, Michael Bauroth <[EMAIL PROTECTED]> wrote:

> I found the following link while searching for memory leaks in NIO:
>
> http://weblogs.java.net/blog/jfarcand/archive/2006/06/tricks_and_tips.html
>


Can we say this is a memory leak?  I don't think so.  We can reduce the
amount of memory used per connection, but it's not a memory leak because
the memory is released when a connection is closed.  BTW, as you know,
MINA doesn't store a ByteBuffer per connection.  MINA follows exactly the
solution this article suggests.  :)

We still need to minimize memory consumption, but it's not the highest
priority task for now because it's working fine in most production
environments.

If you're experiencing a memory leak, why don't you try creating a heap
byte buffer allocator, setting it as the default allocator, and taking a
heap snapshot?
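
For example, something like this (only a sketch; the HeapBufferAllocator
name and the allocate() signature are simplified illustrations, not MINA's
exact ByteBufferAllocator interface):

import java.nio.ByteBuffer;

// Sketch: always hand out heap buffers instead of direct ones.  Direct
// buffers live outside the Java heap, so heap snapshots can't see into
// them; with heap buffers, leaked buffer contents show up in the snapshot.
public class HeapBufferAllocator {
    public ByteBuffer allocate(int capacity, boolean direct) {
        // Deliberately ignore the 'direct' hint and stay on the heap.
        return ByteBuffer.allocate(capacity);
    }
}

Once everything is on the heap, a tool like jmap can take the snapshot,
and you can see which session or decoder is holding on to the buffers.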

Trustin
--
what we call human nature is actually human habit
--
http://gleamynode.net/
--
PGP key fingerprints:
* E167 E6AF E73A CBCE EE41  4A29 544D DE48 FE95 4E7E
* B693 628E 6047 4F8F CFA4  455E 1C62 A7DC 0255 ECA6
