[
https://issues.apache.org/jira/browse/HDFS-10691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15395085#comment-15395085
]
Yiqun Lin edited comment on HDFS-10691 at 7/27/16 5:55 AM:
-----------------------------------------------------------
I ran a test in my local env and found that the description for this jira is not
quite correct: the size here should be in the range ((maxSize/step)*step, maxSize].
Posting the new patch with a new test. You can run the test for this issue; I have
verified in my local env that without my patch applied, the exception is thrown by
this test. Thanks for review.
was (Author: linyiqun):
I ran a test in my local env and found that the description for this jira is not
quite correct: the size here should be in the range ((maxSize/step)*step, maxSize].
Posting the new patch with a new test. You can run the test for this issue; I have
verified in my local env that without the code change in my patch, the exception is
thrown by this test. Thanks for review.
> FileDistribution fails in hdfs oiv command due to
> ArrayIndexOutOfBoundsException
> --------------------------------------------------------------------------------
>
> Key: HDFS-10691
> URL: https://issues.apache.org/jira/browse/HDFS-10691
> Project: Hadoop HDFS
> Issue Type: Bug
> Affects Versions: 2.7.1
> Reporter: Yiqun Lin
> Assignee: Yiqun Lin
> Attachments: HDFS-10691.001.patch, HDFS-10691.002.patch
>
>
> I used the hdfs oiv -p FileDistribution command to do a file analysis, but an
> {{ArrayIndexOutOfBoundsException}} occurred and caused the process to terminate.
> The stack trace:
> {code}
> Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 103
>     at org.apache.hadoop.hdfs.tools.offlineImageViewer.FileDistributionCalculator.run(FileDistributionCalculator.java:243)
>     at org.apache.hadoop.hdfs.tools.offlineImageViewer.FileDistributionCalculator.visit(FileDistributionCalculator.java:176)
>     at org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewerPB.run(OfflineImageViewerPB.java:176)
>     at org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewerPB.main(OfflineImageViewerPB.java:129)
> {code}
> I looked into the code and found that the exception was thrown while
> incrementing a count in {{distribution}}. The reason for the exception is that
> the bucket number exceeded the largest valid index of {{distribution}}.
> Here are my steps:
> 1). The input command params:
> {code}
> hdfs oiv -p FileDistribution -maxSize 104857600 -step 1024000
> {code}
> The {{numIntervals}} in the code is 104857600 / 1024000 = 102 (the real value
> 102.4, truncated by integer division), so the length of {{distribution}} is
> {{numIntervals}} + 1 = 103.
> 2). The {{ArrayIndexOutOfBoundsException}} happens when the file size is in
> the range ((maxSize/step)*step, maxSize]. For example, a file of size
> 104800000 falls into that range. Its bucket number is calculated as
> 104800000/1024000 = 102.3, and the code takes {{Math.ceil}} of this, so the
> final value is 103. But the length of {{distribution}} is only 103, which
> means the valid indices run from 0 to 102, so the
> {{ArrayIndexOutOfBoundsException}} happens (see the sketch below).
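> The arithmetic above can be reproduced in a few lines. The following is a
> standalone sketch, not the actual {{FileDistributionCalculator}} code; the
> variable names only mirror it:
> {code}
> // Hypothetical sketch reproducing the overflow with the example parameters.
> long maxSize = 104857600L;   // -maxSize
> int steps = 1024000;         // -step
> long fileSize = 104800000L;  // lies in ((maxSize/steps)*steps, maxSize]
>
> int numIntervals = (int) (maxSize / steps);        // 102 (integer division)
> long[] distribution = new long[numIntervals + 1];  // length 103, indices 0..102
>
> // Original bucket formula: Math.ceil(104800000.0 / 1024000) = 103
> int bucket = fileSize > maxSize ? distribution.length - 1
>     : (int) Math.ceil((double) fileSize / steps);
>
> distribution[bucket]++;  // throws ArrayIndexOutOfBoundsException: 103
> {code}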
> In a word, the exception happens when {{maxSize}} is not evenly divisible by
> {{step}} and the size of a file is in the range ((maxSize/step)*step,
> maxSize]. The related logic should be changed from
> {code}
> int bucket = fileSize > maxSize ? distribution.length - 1
>     : (int) Math.ceil((double) fileSize / steps);
> {code}
> to
> {code}
> int bucket = fileSize >= maxSize || fileSize > (maxSize / steps) * steps
>     ? distribution.length - 1
>     : (int) Math.ceil((double) fileSize / steps);
> {code}
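> Applying the changed expression to the example above: 104800000 is greater
> than (104857600 / 1024000) * 1024000 = 104448000, so {{bucket}} becomes
> {{distribution.length - 1}} = 102, a valid index, while sizes up to the last
> full step boundary are still placed by {{Math.ceil}} as before.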