[jira] [Commented] (HIVE-13065) Hive throws NPE when writing map type data to a HBase backed table
[ https://issues.apache.org/jira/browse/HIVE-13065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15151100#comment-15151100 ]

Aihua Xu commented on HIVE-13065:
---------------------------------

The patch looks good. +1.

> Hive throws NPE when writing map type data to a HBase backed table
> ------------------------------------------------------------------
>
>                 Key: HIVE-13065
>                 URL: https://issues.apache.org/jira/browse/HIVE-13065
>             Project: Hive
>          Issue Type: Bug
>          Components: HBase Handler
>    Affects Versions: 1.1.0, 2.0.0
>            Reporter: Yongzhi Chen
>            Assignee: Yongzhi Chen
>         Attachments: HIVE-13065.1.patch
>
> Hive throws an NPE when writing data to an HBase-backed table under the following conditions:
> # There is a map type column
> # The map type column has NULL among its values
> Reproduce steps:
> *1) Create an HBase-backed Hive table*
> {code:sql}
> create table hbase_test (id bigint, data map<string, string>)
> stored by 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> with serdeproperties ("hbase.columns.mapping" = ":key,cf:map_col")
> tblproperties ("hbase.table.name" = "hive_test");
> {code}
> *2) Insert data into the above table*
> {code:sql}
> insert overwrite table hbase_test select 1 as id, map('abcd', null) as data
> from src limit 1;
> {code}
> The MapReduce job for the insert query fails.
> Error messages are as below:
> {noformat}
> 2016-02-15 02:26:33,225 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":1,"_col1":{"abcd":null}}}
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
> 	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:444)
> 	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
> 	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> 	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{},"value":{"_col0":1,"_col1":{"abcd":null}}}
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:253)
> 	... 7 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:731)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
> 	at org.apache.hadoop.hive.ql.exec.LimitOperator.processOp(LimitOperator.java:51)
> 	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
> 	at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
> 	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:244)
> 	... 7 more
> Caused by: org.apache.hadoop.hive.serde2.SerDeException: java.lang.NullPointerException
> 	at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:286)
> 	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:666)
> 	... 14 more
> Caused by: java.lang.NullPointerException
> 	at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:221)
> 	at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:236)
> 	at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:275)
> 	at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:222)
> 	at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeField(HBaseRowSerializer.java:194)
> 	at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:118)
> 	at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:282)
> 	... 15 more
> {noformat}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
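The bottom-most frame shows `LazyUtils.writePrimitiveUTF8` receiving the null map value from `HBaseRowSerializer`. Below is a minimal, hypothetical sketch of the kind of null guard that avoids this NPE; it only mirrors the shape of the map-serialization path and is NOT the actual HIVE-13065 patch (the helper name and key=value encoding are illustrative).

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: the NPE above comes from handing a null map value to the
// primitive writer. Guarding against null entries before writing avoids it.
public class NullSafeMapSerializer {

    // Serialize map entries as key=value pairs joined by ',', skipping any
    // entry whose key or value is null (either would NPE in a raw byte writer).
    static String serialize(Map<String, String> map) {
        StringBuilder sb = new StringBuilder();
        boolean first = true;
        for (Map.Entry<String, String> e : map.entrySet()) {
            if (e.getKey() == null || e.getValue() == null) {
                continue; // the guard the failing code path lacks
            }
            if (!first) {
                sb.append(',');
            }
            sb.append(e.getKey()).append('=').append(e.getValue());
            first = false;
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, String> data = new LinkedHashMap<>();
        data.put("abcd", null); // the failing row from the bug report
        data.put("k", "v");
        System.out.println(serialize(data)); // prints "k=v"
    }
}
```

Whether null-valued entries should be skipped entirely or written as an empty/placeholder cell is a design choice; skipping keeps the HBase row sparse, which matches HBase's storage model.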
[jira] [Commented] (HIVE-13065) Hive throws NPE when writing map type data to a HBase backed table
[ https://issues.apache.org/jira/browse/HIVE-13065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15150670#comment-15150670 ]

Yongzhi Chen commented on HIVE-13065:
-------------------------------------

The failures are not related.
[jira] [Commented] (HIVE-13065) Hive throws NPE when writing map type data to a HBase backed table
[ https://issues.apache.org/jira/browse/HIVE-13065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15150433#comment-15150433 ]

Hive QA commented on HIVE-13065:
--------------------------------

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12788108/HIVE-13065.1.patch

{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 5 failed/errored test(s), 9790 tests executed

*Failed tests:*
{noformat}
TestSparkCliDriver-timestamp_lazy.q-bucketsortoptimize_insert_4.q-date_udf.q-and-12-more - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_ivyDownload
org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_partition_coltype_literals
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_authorization_uri_import
org.apache.hive.jdbc.TestSSL.testSSLVersion
{noformat}

Test results: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7008/testReport
Console output: http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/7008/console
Test logs: http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-7008/

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 5 tests failed
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12788108 - PreCommit-HIVE-TRUNK-Build
[jira] [Commented] (HIVE-13065) Hive throws NPE when writing map type data to a HBase backed table
[ https://issues.apache.org/jira/browse/HIVE-13065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15149531#comment-15149531 ]

Yibing Shi commented on HIVE-13065:
-----------------------------------

How about the reading part? If we skip the null values, would it affect the reading part? And what if we have a null value in the key set? That is possible in theory.
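On the read-side question above, a hypothetical round-trip sketch suggests why skipping null *values* at write time is benign: the reader simply never sees those keys, and a map lookup for an absent key returns NULL, which is what the writer saw anyway. A null *key* is different: it cannot be represented as an HBase column qualifier, so it would have to be dropped or rejected at write time. The class and helper names below are illustrative, not Hive APIs.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: simulate a write path that drops entries with a null
// key or value, then "read back" what was stored. Null-valued keys come back
// as absent, so lookups for them still yield null.
public class MapRoundTrip {

    // Simulate write-then-read: only entries surviving the null guard persist.
    static Map<String, String> writeThenRead(Map<String, String> data) {
        Map<String, String> stored = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : data.entrySet()) {
            if (e.getKey() != null && e.getValue() != null) {
                stored.put(e.getKey(), e.getValue());
            }
        }
        return stored;
    }

    public static void main(String[] args) {
        Map<String, String> data = new LinkedHashMap<>();
        data.put("abcd", null);  // the failing entry from the bug report
        data.put("kept", "v");
        Map<String, String> readBack = writeThenRead(data);
        // The null-valued key is absent after the round trip, so looking it up
        // yields null -- the same value the writer started with.
        System.out.println(readBack.get("abcd")); // prints "null"
        System.out.println(readBack.get("kept")); // prints "v"
    }
}
```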