[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-03-07 Thread Prasanth Jayachandran (JIRA)

[ https://issues.apache.org/jira/browse/HIVE-13083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Prasanth Jayachandran updated HIVE-13083:
-
Fix Version/s: 2.0.1

> Writing HiveDecimal to ORC can wrongly suppress present stream
> --
>
> Key: HIVE-13083
> URL: https://issues.apache.org/jira/browse/HIVE-13083
> Project: Hive
>  Issue Type: Bug
>Affects Versions: 0.13.0, 0.14.0, 1.0.0, 1.2.0, 1.1.0, 1.3.0, 2.0.0, 2.1.0
>Reporter: Yi Zhang
>Assignee: Prasanth Jayachandran
> Fix For: 1.3.0, 2.1.0, 2.0.1
>
> Attachments: HIVE-13083-branch-1.patch, HIVE-13083.1.patch, 
> HIVE-13083.2.patch, HIVE-13083.3.patch, HIVE-13083.4.patch, 
> HIVE-13083.4.patch, HIVE-13083.5.patch
>
>
> HIVE-3976 can cause ORC files to be unreadable. The changes it introduced in 
> DecimalTreeWriter can produce a null value after the isPresent stream has 
> already been updated: 
> https://github.com/apache/hive/blob/branch-0.13/ql/src/java/org/apache/hadoop/hive/ql/io/orc/WriterImpl.java#L1337
> As a result of the early return at the line above, the isPresent stream state 
> can become wrong: it records every value as non-null and is therefore 
> suppressed, but the DATA stream ends up with 0 length. When reading such a 
> file we get the following exception:
> {code}
> Caused by: java.io.EOFException: Reading BigInteger past EOF from compressed stream Stream for column 3 kind DATA position: 0 length: 0 range: 0 offset: 0 limit: 0
>   at org.apache.hadoop.hive.ql.io.orc.SerializationUtils.readBigInteger(SerializationUtils.java:176)
>   at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$DecimalTreeReader.next(TreeReaderFactory.java:1264)
>   at org.apache.hadoop.hive.ql.io.orc.TreeReaderFactory$StructTreeReader.next(TreeReaderFactory.java:2004)
>   at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:1039)
> ... 24 more
> {code}
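
For context, here is a minimal sketch of the write path the description points at, simplified from the branch-0.13 DecimalTreeWriter (abbreviated names, not the verbatim Hive source):

{code}
// Simplified sketch of DecimalTreeWriter.write(); not the exact source.
void write(Object obj) throws IOException {
  super.write(obj);                    // records "non-null" in isPresent
  if (obj != null) {
    HiveDecimal decimal = ((HiveDecimalObjectInspector) inspector)
        .getPrimitiveJavaObject(obj);  // may return null, e.g. when the
                                       // value overflows precision/scale
    if (decimal == null) {
      return;                          // the problematic early return:
                                       // isPresent already says non-null, but
                                       // nothing is written to the DATA stream
    }
    SerializationUtils.writeBigInteger(valueStream, decimal.unscaledValue());
    scaleStream.write(decimal.scale());
  }
}
{code}

If no null is ever recorded, the writer suppresses the isPresent stream as all-true while the DATA stream stays at 0 length, which is exactly the state the stack trace above trips over.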





[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-03-07 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.5.patch
            HIVE-13083-branch-1.patch

The latest master and branch-1 patches that were committed.
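
The thread does not inline the committed change; purely as an illustration (not the actual patch), one plausible shape of the repair is to resolve the decimal before the present bit is recorded, so an unrepresentable value is written as a proper null:

{code}
// Hypothetical illustration only -- not necessarily the committed patch.
void write(Object obj) throws IOException {
  HiveDecimal decimal = null;
  if (obj != null) {
    decimal = ((HiveDecimalObjectInspector) inspector)
        .getPrimitiveJavaObject(obj);  // null result is treated as a null row
  }
  super.write(decimal);                // isPresent now matches the DATA stream
  if (decimal != null) {
    SerializationUtils.writeBigInteger(valueStream, decimal.unscaledValue());
    scaleStream.write(decimal.scale());
  }
}
{code}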



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-03-07 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: (was: HIVE-13083-branch-1.patch)



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-23 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.4.patch

Reuploading the patch to trigger a precommit test run.



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-22 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083-branch-1.patch



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-22 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: (was: HIVE-13083-branch-1.patch)



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-22 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.4.patch



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-18 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083-branch-1.patch



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-18 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: (was: HIVE-13083-branch-1.patch)



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-18 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.3.patch

Updated some golden files due to the writer version change.



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-18 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.2.patch

Addressed [~gopalv]'s review comments. Also bumped up the writer version.
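
Bumping the writer version lets readers tell whether a file predates the fix and might carry the inconsistent streams. A hedged sketch; the enum constant name is an assumption, not confirmed in this thread:

{code}
// Assumption: a WriterVersion constant named HIVE_13083 marks fixed writers.
boolean mayHaveSuppressedPresentBug(OrcFile.WriterVersion version) {
  // files from older writers may pair a suppressed PRESENT stream with a
  // zero-length DATA stream on decimal columns
  return version.ordinal() < OrcFile.WriterVersion.HIVE_13083.ordinal();
}
{code}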



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-17 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Reporter: Yi Zhang  (was: Prasanth Jayachandran)



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-17 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083.1.patch



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-17 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Status: Patch Available  (was: Open)



[jira] [Updated] (HIVE-13083) Writing HiveDecimal to ORC can wrongly suppress present stream

2016-02-17 Thread Prasanth Jayachandran (JIRA)


Prasanth Jayachandran updated HIVE-13083:
-
Attachment: HIVE-13083-branch-1.patch




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)