[
https://issues.apache.org/jira/browse/HDFS-6903?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14109905#comment-14109905
]
Colin Patrick McCabe commented on HDFS-6903:
--------------------------------------------
Hi Ayappan,
It seems like you have two options here:
1. disable native checksumming for big-endian (easy choice)
2. fix native checksumming for big-endian
I'd be happy to review a patch for either, but I don't have access to a
PowerPC machine, sorry.
> Crc32 checksum errors in Big-Endian Architecture
> ------------------------------------------------
>
> Key: HDFS-6903
> URL: https://issues.apache.org/jira/browse/HDFS-6903
> Project: Hadoop HDFS
> Issue Type: Bug
> Components: test
> Affects Versions: 3.0.0, 2.4.1, 2.6.0
> Environment: PowerPC RHEL 7 & 6.5 ( ppc64 - Big-Endian )
> Reporter: Ayappan
> Priority: Blocker
>
> Native Crc32 checksum calculation is not handled on Big-Endian
> architectures. In this case, the platform is ppc64. Because of this, several
> test cases in the HDFS module fail.
> Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
> Tests run: 3, Failures: 0, Errors: 2, Skipped: 1, Time elapsed: 13.274 sec
> <<< FAILURE! - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
> testAlgoSwitchRandomized(org.apache.hadoop.hdfs.TestAppendDifferentChecksum)
> Time elapsed: 7.141 sec <<< ERROR!
> java.io.IOException: p=/testAlgoSwitchRandomized, length=28691, i=12288
> at org.apache.hadoop.util.NativeCrc32.nativeVerifyChunkedSums(Native Method)
> at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:57)
> at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:291)
> at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:202)
> at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:137)
> at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:682)
> at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:738)
> at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:795)
> at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:836)
> at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:644)
> at java.io.FilterInputStream.read(FilterInputStream.java:83)
> at org.apache.hadoop.hdfs.AppendTestUtil.check(AppendTestUtil.java:129)
> at org.apache.hadoop.hdfs.TestAppendDifferentChecksum.testAlgoSwitchRandomized(TestAppendDifferentChecksum.java:130)
> testSwitchAlgorithms(org.apache.hadoop.hdfs.TestAppendDifferentChecksum)
> Time elapsed: 1.394 sec <<< ERROR!
> java.io.IOException: p=/testSwitchAlgorithms, length=3000, i=0
> at org.apache.hadoop.util.NativeCrc32.nativeVerifyChunkedSums(Native Method)
> at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:57)
> at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:291)
> at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:202)
> at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:137)
> at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:682)
> at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:738)
> at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:795)
> at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:836)
> at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:644)
> at java.io.FilterInputStream.read(FilterInputStream.java:83)
> at org.apache.hadoop.hdfs.AppendTestUtil.check(AppendTestUtil.java:129)
> at org.apache.hadoop.hdfs.TestAppendDifferentChecksum.testSwitchAlgorithms(TestAppendDifferentChecksum.java:94)
--
This message was sent by Atlassian JIRA
(v6.2#6252)