[
https://issues.apache.org/jira/browse/HIVE-13083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Prasanth Jayachandran updated HIVE-13083:
-----------------------------------------
Attachment: HIVE-13083.1.patch
> Writing HiveDecimal to ORC can wrongly suppress present stream
> --------------------------------------------------------------
>
> Key: HIVE-13083
> URL: https://issues.apache.org/jira/browse/HIVE-13083
> Project: Hive
> Issue Type: Bug
> Affects Versions: 0.13.0, 0.14.0, 1.0.0, 1.2.0, 1.1.0, 1.3.0, 2.0.0, 2.1.0
> Reporter: Prasanth Jayachandran
> Assignee: Prasanth Jayachandran
> Attachments: HIVE-13083-branch-1.patch, HIVE-13083.1.patch
>
>
> HIVE-3976 can cause ORC files to be unreadable. The changes introduced in
> HIVE-3976 for DecimalTreeWriter can create null values after the isPresent
> stream has already been updated.
> https://github.com/apache/hive/blob/branch-0.13/ql/src/java/org/apache/hadoop/hive/ql/io/orc/WriterImpl.java#L1337
> As a result of the above return statement, the isPresent stream can end up in
> a wrong state: it records every value as non-null and is therefore suppressed,
> while the data stream has zero length. Reading such a file fails with the
> following exception:
> {code}
> Caused by: java.io.EOFException: Reading BigInteger past EOF from compressed stream Stream for column 3 kind DATA position: 0 length: 0 range: 0 offset: 0 limit: 0
>     at org.apache.hadoop.hive.ql.io.orc.SerializationUtils.readBigInteger(SerializationUtils.java:176)
>     at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$DecimalTreeReader.next(TreeReaderFactory.java:1264)
>     at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$StructTreeReader.next(TreeReaderFactory.java:2004)
>     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:1039)
>     ... 24 more
> {code}
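The failure mode described above can be illustrated with a simplified, self-contained sketch. This is not the real ORC `WriterImpl` code; the class, field, and method names below are hypothetical stand-ins that model the stream bookkeeping: the present bit is written first, so a later early return on a value that cannot be serialized leaves the DATA stream empty while the present stream still claims the value exists.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the DecimalTreeWriter bug pattern, not ORC's API.
public class DecimalWriterSketch {
    final List<Boolean> isPresent = new ArrayList<>(); // models the PRESENT stream
    final List<String> data = new ArrayList<>();       // models the DATA stream

    void write(String decimal, boolean fitsPrecision) {
        boolean nonNull = decimal != null;
        isPresent.add(nonNull);   // present bit is recorded first
        if (!nonNull) {
            return;               // genuine null: no data expected, state is fine
        }
        if (!fitsPrecision) {
            // HIVE-3976-style early return: the value is silently dropped,
            // but the present bit above was never corrected.
            return;
        }
        data.add(decimal);
    }

    public static void main(String[] args) {
        DecimalWriterSketch w = new DecimalWriterSketch();
        w.write("123.45", false); // dropped, yet marked present
        // Present stream claims one non-null value; data stream holds none.
        System.out.println(w.isPresent + " " + w.data); // prints "[true] []"
    }
}
```

When every row hits this path, every present bit is true, the writer suppresses the present stream as "all non-null", and the reader then attempts to decode a decimal from a zero-length DATA stream, producing the `EOFException` shown above.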
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)