[jira] [Commented] (AVRO-1111) Malformed data can cause OutOfMemoryError in Avro IPC

2012-09-11 Thread Mike Percy (JIRA)

[ 
https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13453281#comment-13453281
 ] 

Mike Percy commented on AVRO-1111:
----------------------------------

Thanks Tom!

> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>            Assignee: Mike Percy
>             Fix For: 1.7.2
>
>         Attachments: AVRO-1111-1.patch, AVRO-1111-2.patch
>
>
> If the data that comes in through the Netty channel buffer is not framed
> correctly/is not valid Avro data, then the incoming data can cause
> arbitrarily large array lists to be created, causing OutOfMemoryError.
> The relevant code (org.apache.avro.ipc.NettyTransportCodec):
>
>   private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
>       ChannelBuffer buffer) throws Exception {
>     if (buffer.readableBytes() < 8) { return false; }
>     int serial = buffer.readInt();
>     listSize = buffer.readInt();
>     dataPack = new NettyDataPack(serial, new ArrayList(listSize));
>     return true;
>   }
>
> If the buffer does not have valid Avro data, the listSize variable can have
> arbitrary values, causing massive ArrayLists to be created, leading to
> OutOfMemoryErrors.
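
For reference, a minimal standalone sketch of the kind of bounds check this issue
calls for: validate the list size declared in the 8-byte pack header before
allocating anything sized by it. The class name, the MAX_DECLARED_BUFFERS cap, and
the use of java.nio.ByteBuffer in place of Netty's ChannelBuffer are illustrative
assumptions, not the committed patch.

    import java.io.IOException;
    import java.nio.ByteBuffer;

    /** Illustrative sketch: reject absurd declared list sizes before allocating. */
    public class FrameHeaderGuard {

      // Hypothetical cap on the number of buffers a single pack may declare; the
      // discussion below favors byte-based limits (100MB, or 10% of the max heap).
      private static final int MAX_DECLARED_BUFFERS = 100000;

      /**
       * Reads the 8-byte pack header (serial, then list size) and returns the
       * declared list size, refusing to trust an attacker-controlled value.
       */
      public static int readListSize(ByteBuffer header) throws IOException {
        if (header.remaining() < 8) {
          throw new IOException("Incomplete pack header: " + header.remaining() + " bytes");
        }
        int serial = header.getInt();  // read for frame-layout fidelity, unused here
        int listSize = header.getInt();
        if (listSize < 0 || listSize > MAX_DECLARED_BUFFERS) {
          throw new IOException("Pack declares " + listSize
              + " buffers; refusing to allocate (limit " + MAX_DECLARED_BUFFERS + ")");
        }
        return listSize;
      }
    }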

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (AVRO-1111) Malformed data can cause OutOfMemoryError in Avro IPC

2012-09-10 Thread Tom White (JIRA)

[ 
https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13451898#comment-13451898
 ] 

Tom White commented on AVRO-1111:
---------------------------------

+1 for the patch (with Philip's 10% suggestion). I would also change the test 
to assert that no response was received, rather than just printing a message.
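
A rough sketch of the assertion suggested above, with several stated assumptions:
the server port, the raw-socket approach, and treating either a closed connection
or a read timeout as "no response" are illustrative choices, not the test in the
attached patch.

    import static org.junit.Assert.assertEquals;

    import java.io.DataOutputStream;
    import java.net.Socket;
    import java.net.SocketTimeoutException;

    import org.junit.Test;

    public class TestMalformedPackGetsNoResponse {

      private static final int SERVER_PORT = 12345;  // hypothetical port of a running IPC server

      @Test
      public void malformedHeaderGetsNoResponse() throws Exception {
        Socket socket = new Socket("localhost", SERVER_PORT);
        try {
          socket.setSoTimeout(2000);  // bounded window in which any (wrong) response could arrive
          DataOutputStream out = new DataOutputStream(socket.getOutputStream());
          out.writeInt(1);                  // serial
          out.writeInt(Integer.MAX_VALUE);  // absurd declared list size
          out.flush();

          int firstByte;
          try {
            firstByte = socket.getInputStream().read();
          } catch (SocketTimeoutException expected) {
            firstByte = -1;  // timed out with no data, which also counts as no response
          }
          // -1 means the server closed the connection without sending anything back.
          assertEquals(-1, firstByte);
        } finally {
          socket.close();
        }
      }
    }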

> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>         Attachments: AVRO-1111-1.patch
>
>
> If the data that comes in through the Netty channel buffer is not framed
> correctly/is not valid Avro data, then the incoming data can cause
> arbitrarily large array lists to be created, causing OutOfMemoryError.
> The relevant code (org.apache.avro.ipc.NettyTransportCodec):
>
>   private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
>       ChannelBuffer buffer) throws Exception {
>     if (buffer.readableBytes() < 8) { return false; }
>     int serial = buffer.readInt();
>     listSize = buffer.readInt();
>     dataPack = new NettyDataPack(serial, new ArrayList(listSize));
>     return true;
>   }
>
> If the buffer does not have valid Avro data, the listSize variable can have
> arbitrary values, causing massive ArrayLists to be created, leading to
> OutOfMemoryErrors.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (AVRO-1111) Malformed data can cause OutOfMemoryError in Avro IPC

2012-09-10 Thread Mike Percy (JIRA)

[ 
https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13451827#comment-13451827
 ] 

Mike Percy commented on AVRO-1111:
----------------------------------

Phil, thank you very much for the ideas! I agree with trying harder to count
the actual received bytes (deep count). Controlling the max size on the server,
while not requiring the client to provide it, looks more future-proof to me.
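
A small sketch of the "count the actual received bytes" idea: track how much of a
request has really arrived and fail fast once a ceiling is crossed, rather than
trusting the size declared in the header. The class name, the per-chunk add()
call, and where the decoder would invoke it are assumptions for illustration.

    import java.io.IOException;

    /** Illustrative accumulator for a running, per-request byte count. */
    public class ReceivedByteCounter {

      private final long maxBytes;
      private long received;

      public ReceivedByteCounter(long maxBytes) {
        this.maxBytes = maxBytes;
      }

      /** Called for every chunk read off the wire for the current request. */
      public void add(int chunkLength) throws IOException {
        received += chunkLength;
        if (received > maxBytes) {
          throw new IOException("Request exceeded " + maxBytes
              + " bytes after receiving " + received);
        }
      }

      /** Reset once the current request has been fully decoded. */
      public void reset() {
        received = 0;
      }
    }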

> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>         Attachments: AVRO-1111-1.patch
>
>
> If the data that comes in through the Netty channel buffer is not framed
> correctly/is not valid Avro data, then the incoming data can cause
> arbitrarily large array lists to be created, causing OutOfMemoryError.
> The relevant code (org.apache.avro.ipc.NettyTransportCodec):
>
>   private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
>       ChannelBuffer buffer) throws Exception {
>     if (buffer.readableBytes() < 8) { return false; }
>     int serial = buffer.readInt();
>     listSize = buffer.readInt();
>     dataPack = new NettyDataPack(serial, new ArrayList(listSize));
>     return true;
>   }
>
> If the buffer does not have valid Avro data, the listSize variable can have
> arbitrary values, causing massive ArrayLists to be created, leading to
> OutOfMemoryErrors.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira


[jira] [Commented] (AVRO-1111) Malformed data can cause OutOfMemoryError in Avro IPC

2012-09-09 Thread Philip Zeyliger (JIRA)

[ 
https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13451727#comment-13451727
 ] 

Philip Zeyliger commented on AVRO-1111:
---------------------------------------

Mike, your patch seems like a pragmatic approach. I'm +1 on the patch. You
might be even more conservative: 10% of the maximum memory seems more than
large enough. Even 100MB seems large enough.

There are two other places we could annotate this sort of information.  We 
could annotate the protocol description to say "maxSize=100MB", to limit the 
size of arrays.  That requires a protocol change, and it also requires keeping 
track of the size of requests, which is tricky in its own way.

Another approach is to pass a "max size" to the transceiver when instantiating 
it.  An application might be able to say "never accept RPCs > 100MB" (in fact, 
that's a reasonable default).  If an application wants to use larger ones, it 
can configure the server appropriately, thereby bypassing the check.

Thoughts on these alternatives?
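
A sketch of the second alternative, server-side configuration instead of a protocol
change. The defaulting rule shown (the smaller of 100MB and 10% of the maximum heap)
just combines the numbers mentioned above; the class and method names are
illustrative, not Avro's actual API.

    /** Illustrative server-side request-size limit with an opt-in override. */
    public class MaxRequestSize {

      private static final long ONE_HUNDRED_MB = 100L * 1024 * 1024;

      private final long maxBytes;

      /** Default: the smaller of 100MB and 10% of the maximum heap. */
      public MaxRequestSize() {
        this(Math.min(ONE_HUNDRED_MB, Runtime.getRuntime().maxMemory() / 10));
      }

      /** Applications that genuinely need bigger RPCs configure a larger limit. */
      public MaxRequestSize(long maxBytes) {
        if (maxBytes <= 0) {
          throw new IllegalArgumentException("maxBytes must be positive: " + maxBytes);
        }
        this.maxBytes = maxBytes;
      }

      public boolean allows(long requestBytes) {
        return requestBytes <= maxBytes;
      }
    }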

> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>         Attachments: AVRO-1111-1.patch
>
>
> If the data that comes in through the Netty channel buffer is not framed
> correctly/is not valid Avro data, then the incoming data can cause
> arbitrarily large array lists to be created, causing OutOfMemoryError.
> The relevant code (org.apache.avro.ipc.NettyTransportCodec):
>
>   private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
>       ChannelBuffer buffer) throws Exception {
>     if (buffer.readableBytes() < 8) { return false; }
>     int serial = buffer.readInt();
>     listSize = buffer.readInt();
>     dataPack = new NettyDataPack(serial, new ArrayList(listSize));
>     return true;
>   }
>
> If the buffer does not have valid Avro data, the listSize variable can have
> arbitrary values, causing massive ArrayLists to be created, leading to
> OutOfMemoryErrors.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira