[ https://issues.apache.org/jira/browse/HADOOP-5302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Eric Yang updated HADOOP-5302:
------------------------------

      Resolution: Fixed
    Release Note: 
What is new in HADOOP-5302:

   - Added a check for records larger than MAX_SIZE, so an oversized record no longer stops the adaptor from sending chunks.
    Hadoop Flags: [Reviewed]
          Status: Resolved  (was: Patch Available)

I just committed this.  Thanks Jerome.
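
For readers skimming the thread, a minimal sketch of what a size guard like this could look like is below. The class name RecordSizeGuard, the accept() method, and the 1 MB value for MAX_SIZE are all illustrative assumptions for this email, not the actual Chukwa adaptor code in HADOOP-5302.patch; the point is only that an oversized record is rejected with a warning instead of stalling the chunk-sending loop.

    // Hypothetical sketch; names and the MAX_SIZE value are illustrative,
    // not taken from the committed patch.
    public class RecordSizeGuard {

        // Assumed upper bound on a single record (illustrative value).
        static final int MAX_SIZE = 1024 * 1024; // 1 MB

        /**
         * Returns true if the record is small enough to be sent as a chunk.
         * Oversized records are skipped with a warning rather than blocking
         * all subsequent chunks.
         */
        static boolean accept(byte[] record) {
            if (record.length > MAX_SIZE) {
                System.err.println("Skipping oversized record: "
                        + record.length + " bytes > MAX_SIZE (" + MAX_SIZE + ")");
                return false;
            }
            return true;
        }

        public static void main(String[] args) {
            byte[] small = new byte[512];
            byte[] huge  = new byte[2 * 1024 * 1024];
            System.out.println("small accepted: " + accept(small)); // true
            System.out.println("huge accepted:  " + accept(huge));  // false
        }
    }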

> If a record is too big, the adaptor will stop sending chunks
> ------------------------------------------------------------
>
>                 Key: HADOOP-5302
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5302
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: contrib/chukwa
>            Reporter: Jerome Boulon
>            Assignee: Jerome Boulon
>         Attachments: HADOOP-5302.patch
>
>


-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
