Hi David,

Thanks for letting us know. I will take a look now.

In the meantime, there is a fix related to types,
https://issues.apache.org/jira/browse/HIVE-624, which might solve the
problem. You might want to try it out.
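
As a stopgap until that fix is in, the reordered query from your own
report (joining count and bar first) may be the easiest workaround:

select foo.foo_name, bar.bar_name, n
from count join bar on count.bar_id = bar.bar_id
join foo on foo.foo_id = bar.foo_id;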

Zheng

On Fri, Jul 10, 2009 at 11:28 AM, David Lerman <[email protected]> wrote:
> Attempting to join three tables is consistently failing with a
> ClassCastException using Hive trunk (r792966) and Hadoop 0.18.3.
>
> The three tables are defined as follows:
>
> create table foo (foo_id int, foo_name string, foo_a string, foo_b string,
>   foo_c string, foo_d string)
> row format delimited fields terminated by ',' stored as textfile;
>
> create table bar (bar_id int, bar_0 int, foo_id int, bar_1 int,
>   bar_name string, bar_a string, bar_b string, bar_c string, bar_d string)
> row format delimited fields terminated by ',' stored as textfile;
>
> create table count (bar_id int, n int)
> row format delimited fields terminated by ',' stored as textfile;
>
> Each table has a single row as follows:
>
> foo:
> 1,foo1,a,b,c,d
>
> bar:
> 10,0,1,1,bar10,a,b,c,d
>
> count:
> 10,2
>
> The failing query is:
>
> select foo.foo_name, bar.bar_name, n
> from foo join bar on foo.foo_id = bar.foo_id
> join count on count.bar_id = bar.bar_id;
>
> Interestingly, the query works if you reorder the joins:
>
> select foo.foo_name, bar.bar_name, n
> from count join bar on count.bar_id = bar.bar_id
> join foo on foo.foo_id = bar.foo_id;
>
> It also works if you remove any of the unused string columns from foo,
> or even just move the unused int columns in bar to the end.
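>
> To illustrate that last variant (a sketch of what I mean: the same
> columns, with the unused ints bar_0 and bar_1 moved last):
>
> create table bar (bar_id int, foo_id int, bar_name string, bar_a string,
>   bar_b string, bar_c string, bar_d string, bar_0 int, bar_1 int)
> row format delimited fields terminated by ',' stored as textfile;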
>
> The exception is as follows:
>
> java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeTypeString.serialize(DynamicSerDeTypeString.java:63)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeFieldList.serialize(DynamicSerDeFieldList.java:249)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeStructBase.serialize(DynamicSerDeStructBase.java:81)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe.serialize(DynamicSerDe.java:177)
>  at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:180)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:492)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.createForwardJoinObject(CommonJoinOperator.java:290)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:533)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:563)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.endGroup(CommonJoinOperator.java:545)
>  at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:159)
>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>
> java.lang.ClassCastException: org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeTypeString.serialize(DynamicSerDeTypeString.java:63)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeFieldList.serialize(DynamicSerDeFieldList.java:249)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDeStructBase.serialize(DynamicSerDeStructBase.java:81)
>  at org.apache.hadoop.hive.serde2.dynamic_type.DynamicSerDe.serialize(DynamicSerDe.java:177)
>  at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:180)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:492)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.createForwardJoinObject(CommonJoinOperator.java:290)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:533)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genObject(CommonJoinOperator.java:522)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:563)
>  at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.endGroup(CommonJoinOperator.java:545)
>  at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:236)
>  at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:329)
>  at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2198)
>
>
> Thanks for your help!
>
>



-- 
Yours,
Zheng
