[ https://issues.apache.org/jira/browse/HADOOP-9114?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Uma Maheswara Rao G updated HADOOP-9114:
----------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.2.1
                   2.3.0
                   3.0.0
     Hadoop Flags: Reviewed
           Status: Resolved  (was: Patch Available)
Thanks, Sathish, for providing the fix. I have just committed this to trunk,
branch-2, and 2.2. Note: your previous patch included files from both the HDFS
and Common projects. I have committed only the Common src part here and
attached HADOOP-9114-002.1.patch, which is what I committed, since we need to
keep the change lists in each project's CHANGES.txt. Please file a separate
JIRA in the HDFS project for the test if you want; I understand that testing
the checksum type requires adding a test on the HDFS side.
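
For readers following along, a minimal sketch of where the reported exception
likely originates and the kind of guard a fix needs, assuming (as the
FSOutputSummer.java.patch attachment suggests) that the NULL checksum type
yields a zero-length checksum buffer that int2byte then indexes. This is a
sketch only; the authoritative change is the attached HADOOP-9114-002.1.patch.

{code:java}
// Sketch only; see HADOOP-9114-002.1.patch for the committed change.
// With dfs.checksum.type=NULL the checksum size is 0, so the checksum
// buffer passed in here is empty and writing bytes[0] throws
// java.lang.ArrayIndexOutOfBoundsException. Guarding on the buffer
// length lets the NULL checksum type pass through without writing.
static byte[] int2byte(int integer, byte[] bytes) {
  if (bytes.length != 0) {  // assumed guard for the NULL checksum case
    bytes[0] = (byte) ((integer >>> 24) & 0xFF);
    bytes[1] = (byte) ((integer >>> 16) & 0xFF);
    bytes[2] = (byte) ((integer >>> 8) & 0xFF);
    bytes[3] = (byte) (integer & 0xFF);
  }
  return bytes;
}
{code}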
> After defining dfs.checksum.type as NULL, writing a file and calling hflush
> will throw java.lang.ArrayIndexOutOfBoundsException
> ----------------------------------------------------------------------------
>
>                  Key: HADOOP-9114
>                  URL: https://issues.apache.org/jira/browse/HADOOP-9114
>              Project: Hadoop Common
>           Issue Type: Bug
>     Affects Versions: 2.0.1-alpha
>             Reporter: liuyang
>             Assignee: sathish
>             Priority: Minor
>              Fix For: 3.0.0, 2.3.0, 2.2.1
>
>          Attachments: FSOutputSummer.java.patch, HADOOP-9114-001.patch,
>                       HADOOP-9114-002.1.patch, HADOOP-9114-002.patch
>
>
> When I tested the dfs.checksum.type configuration parameter: the value can
> be set to NULL, CRC32C, or CRC32. Writing works when the value is CRC32C or
> CRC32, but the client throws java.lang.ArrayIndexOutOfBoundsException when
> the value is configured as NULL.
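
As an editorial aid, a hedged reproduction sketch of the scenario the reporter
describes. The property name and NULL value come from the report; the file
path, payload, and class name are made up for illustration:

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class NullChecksumRepro {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Value taken from the report; CRC32C and CRC32 are reported to work.
    conf.set("dfs.checksum.type", "NULL");
    FileSystem fs = FileSystem.get(conf);
    // Path and payload are illustrative, not from the report.
    try (FSDataOutputStream out = fs.create(new Path("/tmp/checksum-null-test"))) {
      out.write("hello".getBytes("UTF-8"));
      out.hflush(); // reported to throw ArrayIndexOutOfBoundsException pre-fix
    }
  }
}
{code}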
--
This message was sent by Atlassian JIRA
(v6.1#6144)