Yiqun Lin created HDFS-10691:
--------------------------------

             Summary: FileDistribution fails in hdfs oiv command due to 
ArrayIndexOutOfBoundsException
                 Key: HDFS-10691
                 URL: https://issues.apache.org/jira/browse/HDFS-10691
             Project: Hadoop HDFS
          Issue Type: Bug
    Affects Versions: 2.7.1
            Reporter: Yiqun Lin
            Assignee: Yiqun Lin


I used the {{hdfs oiv -p FileDistribution}} command to run a file distribution analysis, but an 
{{ArrayIndexOutOfBoundsException}} was thrown and terminated the process. 
The stack trace:
{code}
Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: 103
        at org.apache.hadoop.hdfs.tools.offlineImageViewer.FileDistributionCalculator.run(FileDistributionCalculator.java:243)
        at org.apache.hadoop.hdfs.tools.offlineImageViewer.FileDistributionCalculator.visit(FileDistributionCalculator.java:176)
        at org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewerPB.run(OfflineImageViewerPB.java:176)
        at org.apache.hadoop.hdfs.tools.offlineImageViewer.OfflineImageViewerPB.main(OfflineImageViewerPB.java:129)
{code}
I looked into the code and found that the exception is thrown while incrementing 
a bucket count in {{distribution}}. The root cause is that the computed bucket 
number can exceed the last valid index of {{distribution}}.

Here are my steps:
1) The input command parameters:
{code}
hdfs oiv -p FileDistribution -maxSize 104857600 -step 1024000
{code}
The {{numIntervals}} in the code is 104857600 / 1024000 = 102 (the real value, 
102.4, is truncated by integer division), so the length of {{distribution}} is 
{{numIntervals}} + 1 = 103.
2) The {{ArrayIndexOutOfBoundsException}} happens when the file size falls in 
the range ({{maxSize - steps}}, {{maxSize}}]. For example, a file of size 
104800000 lies in that range. Its bucket number is calculated as 
104800000 / 1024000 ≈ 102.34, and the code then takes the {{Math.ceil}} of this, 
giving 103. But the length of {{distribution}} is also 103, meaning the valid 
indices run from 0 to 102, so the {{ArrayIndexOutOfBoundsException}} is thrown. 
A minimal standalone reproduction is sketched below.
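
A minimal, standalone reproduction under the parameters above (a sketch of the 
logic described in this report; the class and variable names are illustrative, 
not the actual {{FileDistributionCalculator}} source):
{code}
// Standalone sketch reproducing the off-by-one described above.
public class BucketOverflowDemo {
  public static void main(String[] args) {
    long maxSize = 104857600L; // -maxSize
    int steps = 1024000;       // -step

    int numIntervals = (int) (maxSize / steps);       // 102 (102.4 truncated)
    long[] distribution = new long[numIntervals + 1]; // length 103, indices 0..102

    long fileSize = 104800000L; // a file in the last, partial interval
    int bucket = fileSize > maxSize ? distribution.length - 1
        : (int) Math.ceil((double) fileSize / steps); // ceil(102.34) = 103
    distribution[bucket]++; // throws ArrayIndexOutOfBoundsException: 103
  }
}
{code}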

The exception occurs whenever {{maxSize}} is not evenly divisible by {{step}}. 
The related logic should be changed from
{code}
int bucket = fileSize > maxSize ? distribution.length - 1
    : (int) Math.ceil((double) fileSize / steps);
{code}
to 
{code}
int bucket = fileSize >= maxSize || (fileSize + steps) > maxSize
    ? distribution.length - 1
    : (int) Math.ceil((double) fileSize / steps);
{code}
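
As a quick sanity check (again a sketch in the same illustrative style, not the 
patched class), the revised condition clamps a file in the last, partial 
interval to the top bucket instead of overflowing:
{code}
// Revised expression applied to the same example as above.
long maxSize = 104857600L, fileSize = 104800000L;
int steps = 1024000;
long[] distribution = new long[(int) (maxSize / steps) + 1]; // length 103
int bucket = fileSize >= maxSize || (fileSize + steps) > maxSize
    ? distribution.length - 1                     // 102, the last valid index
    : (int) Math.ceil((double) fileSize / steps);
distribution[bucket]++; // no exception; counted in distribution[102]
{code}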



