Hi,

Going through the mailing list archive, I couldn't find any mention of
the problem I've encountered.

Basically, I have a CumulativeProtocolDecoder that has to accumulate
some data internally in addition to the 'regular' messages; sort of an
out-of-band channel. To do this I'm storing an IoBuffer instance as a
session attribute.

Now here's the problem: the buffer I'm storing is allocated via
IoBuffer.allocate(0, false) and then set to auto-expand via
setAutoExpand(true). However, this always returns the same buffer
instance, which is then wrongly shared between all my sessions; more
precisely, it is the static EMPTY_HEAP_BUFFER inside IoBuffer.
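To make the issue concrete, here is a minimal, self-contained sketch (hypothetical class and method names, not MINA's actual implementation) of the allocation shortcut I believe is at work: a factory that hands out one shared instance for every zero-capacity heap request, so that the first session to auto-expand and write effectively mutates every other session's "empty" buffer.

```java
import java.nio.ByteBuffer;

// Hypothetical sketch of the zero-capacity "optimization": every
// allocate(0, false) call yields the same static empty instance.
class SketchBuffer {
    private static final SketchBuffer EMPTY_HEAP_BUFFER = new SketchBuffer(0);

    private ByteBuffer buf;
    private boolean autoExpand;

    private SketchBuffer(int capacity) {
        this.buf = ByteBuffer.allocate(capacity);
    }

    static SketchBuffer allocate(int capacity, boolean direct) {
        // The problematic shortcut: all zero-capacity heap requests
        // are answered with the one shared instance.
        if (capacity == 0 && !direct) {
            return EMPTY_HEAP_BUFFER;
        }
        return new SketchBuffer(capacity);
    }

    void setAutoExpand(boolean autoExpand) {
        this.autoExpand = autoExpand;
    }

    void put(byte b) {
        if (autoExpand && !buf.hasRemaining()) {
            // Grow by replacing the backing buffer; the (shared) wrapper
            // object itself stays the same, so all holders see the change.
            ByteBuffer bigger = ByteBuffer.allocate(Math.max(1, buf.capacity() * 2));
            buf.flip();
            bigger.put(buf);
            buf = bigger;
        }
        buf.put(b);
    }

    int position() {
        return buf.position();
    }
}

public class SharedEmptyBufferDemo {
    public static void main(String[] args) {
        SketchBuffer sessionA = SketchBuffer.allocate(0, false); // "session 1"
        SketchBuffer sessionB = SketchBuffer.allocate(0, false); // "session 2"

        sessionA.setAutoExpand(true);
        sessionA.put((byte) 1);

        // Both sessions hold the same object, so session 2 now
        // sees session 1's data.
        System.out.println(sessionA == sessionB);
        System.out.println(sessionB.position());
    }
}
```

Presumably requesting a non-zero initial capacity, e.g. IoBuffer.allocate(1, false), sidesteps the shared instance, since only the zero-capacity case seems to hit the static buffer; but that feels like working around the API rather than using it as intended.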

Now my questions:
1.) Why does allocate perform this "optimization" of returning the
same empty buffer whenever a zero-capacity buffer is requested?
2.) If this is indeed done on purpose, should these buffers still be
allowed to grow?

Cheers,

Martin Hoffesommer
