[ https://issues.apache.org/jira/browse/AVRO-1111?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13453281#comment-13453281 ]

Mike Percy commented on AVRO-1111:
----------------------------------

Thanks Tom!
                
> Malformed data can cause OutOfMemoryError in Avro IPC
> -----------------------------------------------------
>
>                 Key: AVRO-1111
>                 URL: https://issues.apache.org/jira/browse/AVRO-1111
>             Project: Avro
>          Issue Type: Bug
>          Components: java
>    Affects Versions: 1.6.3
>            Reporter: Hari Shreedharan
>            Assignee: Mike Percy
>             Fix For: 1.7.2
>
>         Attachments: AVRO-1111-1.patch, AVRO-1111-2.patch
>
>
> If the data that comes in through the Netty channel buffer is not framed
> correctly or is not valid Avro data, the incoming data can cause
> arbitrarily large ArrayLists to be created, causing an OutOfMemoryError.
> The relevant code (org.apache.avro.ipc.NettyTransportCodec):
>
>   private boolean decodePackHeader(ChannelHandlerContext ctx, Channel channel,
>       ChannelBuffer buffer) throws Exception {
>     if (buffer.readableBytes() < 8) {
>       return false;
>     }
>     int serial = buffer.readInt();
>     listSize = buffer.readInt();
>     dataPack = new NettyDataPack(serial, new ArrayList<ByteBuffer>(listSize));
>     return true;
>   }
>
> If the buffer does not contain valid Avro data, the listSize variable can
> hold an arbitrary value, causing massive ArrayLists to be created and
> leading to OutOfMemoryErrors.
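The decode quoted above pre-sizes an ArrayList from a length field read off the wire, so a garbage header turns directly into a huge allocation. A minimal defensive sketch of the idea behind the fix: bound the untrusted size before allocating. This is not the actual AVRO-1111 patch; the `MAX_LIST_SIZE` cap is a hypothetical value, and plain `java.nio.ByteBuffer` stands in for Netty's `ChannelBuffer` to keep the example self-contained.

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

public class SafeHeaderDecode {
    // Hypothetical upper bound on buffers per data pack (assumption,
    // not the limit used by the real patch).
    static final int MAX_LIST_SIZE = 10_000;

    /**
     * Reads the 8-byte pack header (serial, listSize) and rejects sizes
     * that cannot be valid, instead of pre-sizing an ArrayList from
     * untrusted input. Returns null when too few bytes are available.
     */
    static List<ByteBuffer> decodePackHeader(ByteBuffer buffer) {
        if (buffer.remaining() < 8) {
            return null; // header not fully received yet
        }
        int serial = buffer.getInt();   // read and ignored in this sketch
        int listSize = buffer.getInt();
        if (listSize < 0 || listSize > MAX_LIST_SIZE) {
            // Malformed or non-Avro data: fail fast instead of allocating.
            throw new IllegalArgumentException(
                "invalid Avro pack list size: " + listSize);
        }
        // listSize is now bounded, so this allocation is bounded too.
        return new ArrayList<ByteBuffer>(listSize);
    }

    public static void main(String[] args) {
        // Well-formed header: serial = 1, listSize = 3.
        ByteBuffer ok = ByteBuffer.allocate(8).putInt(1).putInt(3);
        ok.flip();
        System.out.println("valid size accepted: "
            + (decodePackHeader(ok) != null));

        // Malformed header: listSize is an arbitrary huge value.
        ByteBuffer bad = ByteBuffer.allocate(8)
            .putInt(1).putInt(Integer.MAX_VALUE);
        bad.flip();
        try {
            decodePackHeader(bad);
        } catch (IllegalArgumentException e) {
            System.out.println("garbage size rejected: true");
        }
    }
}
```

The key design point is that any length or count read from an untrusted peer must be validated against a sane bound before it drives an allocation.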

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira