[ https://issues.apache.org/jira/browse/HIVE-5970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13840730#comment-13840730 ]
Eric Chu commented on HIVE-5970:
--------------------------------

Looking at the code where the exception occurs, it seems like the array unpackedPatch could be empty (i.e. pl is 0), but there is no check before the array is accessed after SerializationUtils.readInts():

    // unpack the patch blob
    long[] unpackedPatch = new long[pl];
    SerializationUtils.readInts(unpackedPatch, 0, pl, pw + pgw, input);

    // apply the patch directly when decoding the packed data
    int patchIdx = 0;
    long currGap = 0;
    long currPatch = 0;

    currGap = unpackedPatch[patchIdx] >>> pw;

(A minimal standalone sketch of this access pattern is included after the stack trace below.)

> ArrayIndexOutOfBoundsException in RunLengthIntegerReaderV2.java
> ---------------------------------------------------------------
>
>                 Key: HIVE-5970
>                 URL: https://issues.apache.org/jira/browse/HIVE-5970
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats
>    Affects Versions: 0.12.0
>            Reporter: Eric Chu
>            Priority: Critical
>              Labels: orcfile
>
> A workload involving ORC tables started getting the following ArrayIndexOutOfBoundsException AFTER the upgrade to Hive 0.12. The file (RunLengthIntegerReaderV2.java) was added as part of HIVE-4123.
>
> 2013-12-04 14:42:08,537 ERROR
> cause:java.io.IOException: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
> 2013-12-04 14:42:08,537 WARN org.apache.hadoop.mapred.Child: Error running child
> java.io.IOException: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:304)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:220)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:215)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:200)
>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:396)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:276)
>     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
>     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
>     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:108)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:302)
>     ... 11 more
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
>     at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readPatchedBaseValues(RunLengthIntegerReaderV2.java:171)
>     at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readValues(RunLengthIntegerReaderV2.java:54)
>     at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.next(RunLengthIntegerReaderV2.java:287)
>     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$LongTreeReader.next(RecordReaderImpl.java:473)
>     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1157)
>     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2196)
>     at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:129)
>     at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:80)
>     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:274)
>     ... 15 more
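For illustration only, here is a self-contained sketch of the access pattern quoted above. It is not Hive code: PatchedBaseSketch and firstPatchGap are made-up names, pl and pw just stand in for the decoded patch-list length and patch width, and the guard shown is only one possible way to turn the bare ArrayIndexOutOfBoundsException into a diagnosable error; the real fix may well be upstream of this point (i.e. in whatever lets pl become 0 in the first place).

    // Standalone sketch only -- mimics the access pattern in
    // readPatchedBaseValues(); names below are hypothetical, not Hive APIs.
    import java.io.IOException;

    public class PatchedBaseSketch {

      // "pl" stands in for the decoded patch list length, "pw" for the patch width.
      static long firstPatchGap(int pl, int pw) throws IOException {
        long[] unpackedPatch = new long[pl];
        // (In Hive, SerializationUtils.readInts(...) would fill the array here.)

        // Hypothetical guard: an empty patch list would otherwise make the
        // access below throw java.lang.ArrayIndexOutOfBoundsException: 0.
        if (unpackedPatch.length == 0) {
          throw new IOException("Corrupt PATCHED_BASE run: patch list is empty");
        }

        int patchIdx = 0;
        return unpackedPatch[patchIdx] >>> pw;   // the access that blows up when pl == 0
      }

      public static void main(String[] args) {
        try {
          System.out.println(firstPatchGap(1, 2)); // non-empty patch list: prints 0
          System.out.println(firstPatchGap(0, 2)); // empty patch list: IOException
        } catch (IOException e) {
          System.out.println("caught: " + e.getMessage());
        }
      }
    }

Without the guard, the second call fails with exactly the ArrayIndexOutOfBoundsException: 0 seen in the stack trace, which is why a bounds check (or a fix for whatever produces the empty patch list) seems worth considering.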