[ https://issues.apache.org/jira/browse/HIVE-12378?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15004409#comment-15004409 ]

Yongzhi Chen commented on HIVE-12378:
-------------------------------------

Binary cannot be null. This is consistent with the other data types for Hive HBase
tables. For example, when I tried insert into test9 values (5, NULL); against test9
(whose second column is string) or test1 (whose second column is int), I got a
similar exception:
{noformat}
URL:
  http://ychencdh57-1.vpc.cloudera.com:8088/taskdetails.jsp?jobid=job_1447108763205_0022&tipid=task_1447108763205_0022_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"5","tmp_values_col2":null}
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:179)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"5","tmp_values_col2":null}
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:507)
        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:170)
        ... 8 more
Caused by: java.lang.IllegalArgumentException: No columns to insert
        at org.apache.hadoop.hbase.client.HTable.validatePut(HTable.java:1561)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.validatePut(BufferedMutatorImpl.java:147)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.doMutate(BufferedMutatorImpl.java:134)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.mutate(BufferedMutatorImpl.java:98)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1105)
        at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:146)
        at org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat$MyRecordWriter.write(HiveHBaseTableOutputFormat.java:117)
        at org.apache.hadoop.hive.ql.io.HivePassThroughRecordWriter.write(HivePassThroughRecordWriter.java:40)
        at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:695)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
        at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
        at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
        at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:497)
        ... 9 more

{noformat}
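
The "No columns to insert" at the bottom of the trace comes from HBase itself: a Put that carries a row key but no cells is rejected by validatePut before anything is written, and that is exactly what an all-null non-key row degenerates to in the HBase storage handler. As a minimal illustration (the table name "test9" and the row key bytes here are placeholders, not taken from the patch):
{noformat}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class EmptyPutDemo {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    try (Connection conn = ConnectionFactory.createConnection(conf);
         Table table = conn.getTable(TableName.valueOf("test9"))) {
      // Row key only; no addColumn() call, which is what a row whose
      // non-key columns are all null degenerates to.
      Put put = new Put(Bytes.toBytes("5"));
      // Fails validation with IllegalArgumentException: "No columns to insert"
      table.put(put);
    }
  }
}
{noformat}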

The following code is added because I want LazyBioBinary to be consistent with LazyBinary.
In the LazyBinary.init method, it calls super.init(bytes, start, length), which is
LazyObject.init and contains the same code as below:
{noformat}
    if (bytes == null) {
      throw new RuntimeException("bytes cannot be null!");
    }
    this.isNull = false;
{noformat}
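
For illustration only, here is a simplified sketch of that convention (the class names are made up; this is not the actual Hive or patch code). An init() that follows LazyObject rejects null bytes up front, so "not present" has to be expressed by never calling init() and leaving isNull set, and a binary wrapper built on top of it inherits the same contract:
{noformat}
// Hypothetical names for illustration of the LazyObject.init convention.
class LazyObjectSketch {
  protected boolean isNull = true;
  protected byte[] bytes;
  protected int start, length;

  void init(byte[] bytes, int start, int length) {
    if (bytes == null) {
      throw new RuntimeException("bytes cannot be null!");
    }
    this.bytes = bytes;
    this.start = start;
    this.length = length;
    this.isNull = false;
  }
}

class LazyBinarySketch extends LazyObjectSketch {
  @Override
  void init(byte[] bytes, int start, int length) {
    // Same contract as LazyBinary.init: delegate to the base null check first.
    super.init(bytes, start, length);
    // ... decode the binary payload from bytes[start .. start + length) ...
  }
}
{noformat}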






> Exception on HBaseSerDe.serialize binary field
> ----------------------------------------------
>
>                 Key: HIVE-12378
>                 URL: https://issues.apache.org/jira/browse/HIVE-12378
>             Project: Hive
>          Issue Type: Bug
>          Components: HBase Handler, Serializers/Deserializers
>    Affects Versions: 1.0.0, 1.1.0, 2.0.0
>            Reporter: Yongzhi Chen
>            Assignee: Yongzhi Chen
>         Attachments: HIVE-12378.1.patch
>
>
> An issue was reproduced with binary-typed HBase columns in Hive.
> The case below works fine:
> CREATE TABLE test9 (key int, val string)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
> "hbase.columns.mapping" = ":key,cf:val#b"
> );
> insert into test9 values(1,"hello");
> But when the string type is changed to binary, as in:
> CREATE TABLE test2 (key int, val binary)
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
> "hbase.columns.mapping" = ":key,cf:val#b"
> );
> insert into table test2 values(1, 'hello');
> The following exception is thrown:
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"hello"}
> ...
> Caused by: java.lang.RuntimeException: Hive internal error.
> at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitive(LazyUtils.java:322)
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:220)
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serializeField(HBaseRowSerializer.java:194)
> at org.apache.hadoop.hive.hbase.HBaseRowSerializer.serialize(HBaseRowSerializer.java:118)
> at org.apache.hadoop.hive.hbase.HBaseSerDe.serialize(HBaseSerDe.java:282)
> ... 16 more
> We should support the Hive binary type for HBase columns.
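
For context on the "Hive internal error" in the trace quoted above: LazyUtils.writePrimitive dispatches on the column's primitive category, and a category that this write path does not handle ends in a generic failure rather than being serialized. The sketch below is only an illustrative reconstruction of that dispatch pattern; the enum and method are simplified stand-ins, not the actual Hive source:
{noformat}
import java.io.IOException;
import java.io.OutputStream;

class WritePrimitiveSketch {
  // Simplified stand-in for Hive's PrimitiveCategory.
  enum Category { BOOLEAN, INT, STRING, BINARY }

  // Dispatch in the style of LazyUtils.writePrimitive: each supported
  // category gets an encoding; anything unhandled falls through to a
  // generic "Hive internal error." instead of being written out.
  static void writePrimitive(OutputStream out, Object value, Category category)
      throws IOException {
    switch (category) {
      case BOOLEAN:
      case INT:
      case STRING:
        out.write(value.toString().getBytes("UTF-8"));
        break;
      // note: no BINARY case on this path
      default:
        throw new RuntimeException("Hive internal error.");
    }
  }
}
{noformat}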



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
