[ https://issues.apache.org/jira/browse/HADOOP-1564?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12511516 ]

Raghu Angadi commented on HADOOP-1564:
--------------------------------------

Dhruba,

Yesterday we started enforcing the constraint that the block size must be a 
multiple of io.bytes.per.checksum for new files. This test fails because one of 
its configs violates that constraint. Could you update the patch with the correction?
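
For reference, the newly enforced constraint amounts to a divisibility check like the sketch below. This is an illustrative standalone class, not Hadoop's actual validation code; the method name and message text are hypothetical.

```java
// Hypothetical sketch (not Hadoop's actual code) of the new constraint:
// a file's block size must be an exact multiple of io.bytes.per.checksum.
public class BlockSizeCheck {
    static void validate(long blockSize, int bytesPerChecksum) {
        if (blockSize % bytesPerChecksum != 0) {
            throw new IllegalArgumentException(
                "io.bytes.per.checksum (" + bytesPerChecksum +
                ") must evenly divide the block size (" + blockSize + ")");
        }
    }

    public static void main(String[] args) {
        validate(64L * 1024 * 1024, 512); // OK: 64 MB is a multiple of 512
        validate(100, 512);               // rejected: 100 is not a multiple of 512
    }
}
```

So a test config that sets, say, a 100-byte block size against the default 512-byte checksum chunk would now be rejected at file creation.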


> Write unit tests to detect CRC corruption
> -----------------------------------------
>
>                 Key: HADOOP-1564
>                 URL: https://issues.apache.org/jira/browse/HADOOP-1564
>             Project: Hadoop
>          Issue Type: Bug
>          Components: dfs
>            Reporter: dhruba borthakur
>            Assignee: dhruba borthakur
>         Attachments: crctest.4
>
>
> The unit tests should have some way to test the case when CRC files are 
> corrupted.
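
One simple way such a unit test could simulate CRC corruption is to flip a byte in the on-disk checksum file and then verify that a subsequent read reports a checksum error. The sketch below is illustrative only; the file path and offset are placeholders, not the actual layout of DFS checksum files.

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Hypothetical helper a test could use to inject corruption:
// invert one byte of a file in place so a later CRC verification fails.
public class CorruptCrc {
    static void flipByte(String path, long offset) throws IOException {
        try (RandomAccessFile f = new RandomAccessFile(path, "rw")) {
            f.seek(offset);
            int b = f.read();          // read the original byte
            f.seek(offset);
            f.write(b ^ 0xFF);         // write back its bitwise complement
        }
    }
}
```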

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.