[ https://issues.apache.org/jira/browse/HIVE-10729?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14598804#comment-14598804 ]
Greg Senia commented on HIVE-10729:
-----------------------------------
Gunther Hagleitner and Matt McCline: using this patch against my JIRA HIVE-11051 and its test case, on Hadoop 2.4.1 with Hive 1.2.0 and Tez 0.5.4, it still fails:
{noformat}
Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
{"cnctevn_id":"002246948195","svcrqst_id":"0000003629537980","svcrqst_crt_dts":"2015-04-24 12:48:37.859683","subject_seq_no":1,"plan_component":"HMOM1 ","cust_segment":"RM ","cnctyp_cd":"001","cnctmd_cd":"D02","cnctevs_cd":"007","svcrtyp_cd":"335","svrstyp_cd":"088","cmpltyp_cd":" ","catsrsn_cd":" ","apealvl_cd":" ","cnstnty_cd":"001","svcrqst_asrqst_ind":"Y","svcrqst_rtnorig_in":"N","svcrqst_vwasof_dt":"null","sum_reason_cd":"98","sum_reason":"Exclude","crsr_master_claim_index":null,"svcrqct_cds":[" "],"svcrqst_lupdt":"2015-04-24 12:48:37.859683","crsr_lupdt":null,"cntevsds_lupdt":"2015-04-24 12:48:40.499238","ignore_me":1,"notes":null}
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:91)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:68)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:290)
    at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:148)
    ... 13 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
{"cnctevn_id":"002246948195","svcrqst_id":"0000003629537980","svcrqst_crt_dts":"2015-04-24 12:48:37.859683","subject_seq_no":1,"plan_component":"HMOM1 ","cust_segment":"RM ","cnctyp_cd":"001","cnctmd_cd":"D02","cnctevs_cd":"007","svcrtyp_cd":"335","svrstyp_cd":"088","cmpltyp_cd":" ","catsrsn_cd":" ","apealvl_cd":" ","cnstnty_cd":"001","svcrqst_asrqst_ind":"Y","svcrqst_rtnorig_in":"N","svcrqst_vwasof_dt":"null","sum_reason_cd":"98","sum_reason":"Exclude","crsr_master_claim_index":null,"svcrqct_cds":[" "],"svcrqst_lupdt":"2015-04-24 12:48:37.859683","crsr_lupdt":null,"cntevsds_lupdt":"2015-04-24 12:48:40.499238","ignore_me":1,"notes":null}
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:518)
    at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
    ... 16 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unexpected exception: Index: 0, Size: 0
    at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:426)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:122)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:97)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:162)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:508)
    ... 17 more
Caused by: java.lang.IndexOutOfBoundsException: Index: 0, Size: 0
    at java.util.ArrayList.rangeCheck(ArrayList.java:635)
    at java.util.ArrayList.set(ArrayList.java:426)
    at org.apache.hadoop.hive.ql.exec.persistence.MapJoinBytesTableContainer.fixupComplexObjects(MapJoinBytesTableContainer.java:424)
    at org.apache.hadoop.hive.ql.exec.persistence.HybridHashTableContainer$ReusableRowContainer.uppack(HybridHashTableContainer.java:875)
    at org.apache.hadoop.hive.ql.exec.persistence.HybridHashTableContainer$ReusableRowContainer.first(HybridHashTableContainer.java:845)
    at org.apache.hadoop.hive.ql.exec.persistence.HybridHashTableContainer$ReusableRowContainer.first(HybridHashTableContainer.java:722)
    at org.apache.hadoop.hive.ql.exec.persistence.UnwrapRowContainer.first(UnwrapRowContainer.java:62)
    at org.apache.hadoop.hive.ql.exec.persistence.UnwrapRowContainer.first(UnwrapRowContainer.java:33)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genUniqueJoinObject(CommonJoinOperator.java:650)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:756)
    at org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:414)
    ... 23 more
]], Vertex failed as one or more tasks failed. failedTasks:1, Vertex vertex_1434641270368_13820_2_01 [Map 2] killed/failed due to:null]DAG failed due to vertex failure. failedVertices:1 killedVertices:0
{noformat}
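For what it's worth, the innermost frames above go through HybridHashTableContainer / MapJoinBytesTableContainer, i.e. the hybrid grace hash-table path of the Tez map join. Purely as an unverified workaround sketch (my assumption, not something confirmed in this issue), one could try steering the failing query off that path before re-running it:
{code:sql}
-- Assumption: falling back from the hybrid grace hash table may avoid the
-- MapJoinBytesTableContainer.fixupComplexObjects() failure shown in the trace above.
set hive.mapjoin.hybridgrace.hashtable=false;
{code}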
> Query fails when selecting complex columns from a joined table (Tez map join
> only)
> -------------------------------------------------------------------------------
>
> Key: HIVE-10729
> URL: https://issues.apache.org/jira/browse/HIVE-10729
> Project: Hive
> Issue Type: Bug
> Components: Query Processor
> Affects Versions: 1.2.0
> Reporter: Selina Zhang
> Assignee: Matt McCline
> Attachments: HIVE-10729.03.patch, HIVE-10729.1.patch,
> HIVE-10729.2.patch
>
>
> When a map join happens and the projection columns include complex data types,
> the query fails.
> Steps to reproduce:
> {code:sql}
> hive> set hive.auto.convert.join;
> hive.auto.convert.join=true
> hive> desc foo;
> a array<int>
> hive> select * from foo;
> [1,2]
> hive> desc src_int;
> key int
> value string
> hive> select * from src_int where key=2;
> 2 val_2
> hive> select * from foo join src_int src on src.key = foo.a[1];
> {code}
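> The foo and src_int tables are not created in the snippet above; for completeness, here is a minimal setup sketch that matches their shapes (table and column names taken from the repro; the seeding idiom via the standard src test table and everything else is assumed):
> {code:sql}
> -- Hypothetical setup matching the repro above; any row-capable storage format should do.
> CREATE TABLE foo (a ARRAY<INT>);
> INSERT OVERWRITE TABLE foo SELECT ARRAY(1, 2) FROM src LIMIT 1;
>
> CREATE TABLE src_int (key INT, value STRING);
> INSERT OVERWRITE TABLE src_int SELECT 2, 'val_2' FROM src LIMIT 1;
> {code}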
> The query fails with the following stack trace:
> {noformat}
> Caused by: java.lang.ClassCastException:
> org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryArray cannot be cast to
> [Ljava.lang.Object;
> at
> org.apache.hadoop.hive.serde2.objectinspector.StandardListObjectInspector.getList(StandardListObjectInspector.java:111)
> at
> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:314)
> at
> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:262)
> at
> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doSerialize(LazySimpleSerDe.java:246)
> at
> org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.serialize(AbstractEncodingAwareSerDe.java:50)
> at
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:692)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
> at
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
> at
> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.internalForward(CommonJoinOperator.java:644)
> at
> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genAllOneUniqueJoinObject(CommonJoinOperator.java:676)
> at
> org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:754)
> at
> org.apache.hadoop.hive.ql.exec.MapJoinOperator.process(MapJoinOperator.java:386)
> ... 23 more
> {noformat}
> A similar error occurs when the projection columns include a map:
> {code:sql}
> hive> CREATE TABLE test (a INT, b MAP<INT, STRING>) STORED AS ORC;
> hive> INSERT OVERWRITE TABLE test SELECT 1, MAP(1, "val_1", 2, "val_2") FROM
> src LIMIT 1;
> hive> select * from src join test where src.key=test.a;
> {code}
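> Since the failure is scoped to the Tez map-join path, a per-session fallback to a shuffle join should (an assumption, not verified here) let both repro queries complete until the fix lands:
> {code:sql}
> -- Disable automatic map-join conversion so the join runs as a shuffle join instead.
> set hive.auto.convert.join=false;
> select * from src join test where src.key=test.a;
> {code}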