[ 
https://issues.apache.org/jira/browse/HDFS-16533?focusedWorklogId=793953&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-793953
 ]

ASF GitHub Bot logged work on HDFS-16533:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 21/Jul/22 22:28
            Start Date: 21/Jul/22 22:28
    Worklog Time Spent: 10m 
      Work Description: jojochuang commented on code in PR #4155:
URL: https://github.com/apache/hadoop/pull/4155#discussion_r927145405


##########
hadoop-hdfs-project/hadoop-hdfs-client/src/main/java/org/apache/hadoop/hdfs/FileChecksumHelper.java:
##########
@@ -303,7 +303,8 @@ FileChecksum makeCompositeCrcResult() throws IOException {
       byte[] blockChecksumBytes = blockChecksumBuf.getData();
 
       long sumBlockLengths = 0;
-      for (int i = 0; i < locatedBlocks.size() - 1; ++i) {
+      int i = 0;

Review Comment:
   is this change necessary?





Issue Time Tracking
-------------------

    Worklog Id:     (was: 793953)
    Time Spent: 3h 50m  (was: 3h 40m)

> COMPOSITE_CRC failed between replicated file and striped file.
> --------------------------------------------------------------
>
>                 Key: HDFS-16533
>                 URL: https://issues.apache.org/jira/browse/HDFS-16533
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: hdfs, hdfs-client
>            Reporter: ZanderXu
>            Assignee: ZanderXu
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 3h 50m
>  Remaining Estimate: 0h
>
> Comparing the COMPOSITE_CRC of a replicated file and a striped file that 
> holds the same data fails for some randomly chosen verify lengths. 
> The failure can be reproduced as follows:
> {code:java}
> @Test(timeout = 90000)
> public void testStripedAndReplicatedFileChecksum2() throws Exception {
>   int abnormalSize = (dataBlocks * 2 - 2) * blockSize +
>       (int) (blockSize * 0.5);
>   prepareTestFiles(abnormalSize, new String[] {stripedFile1, replicatedFile});
>   int loopNumber = 100;
>   while (loopNumber-- > 0) {
>     int verifyLength = ThreadLocalRandom.current()
>         .nextInt(10, abnormalSize);
>     FileChecksum stripedFileChecksum1 = getFileChecksum(stripedFile1,
>         verifyLength, false);
>     FileChecksum replicatedFileChecksum = getFileChecksum(replicatedFile,
>         verifyLength, false);
>     if (checksumCombineMode.equals(ChecksumCombineMode.COMPOSITE_CRC.name())) 
> {
>       Assert.assertEquals(stripedFileChecksum1, replicatedFileChecksum);
>     } else {
>       Assert.assertNotEquals(stripedFileChecksum1, replicatedFileChecksum);
>     }
>   }
> } {code}
> Tracing the root cause shows that `FileChecksumHelper#makeCompositeCrcResult` 
> may compute an incorrect `consumedLastBlockLength` when updating the checksum 
> for the last block of the requested length, which may not be the last block 
> of the file.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
