Ok, thanks.

On Tue, Apr 6, 2010 at 4:10 PM, Zheng Shao <[email protected]> wrote:

> Yes we use sun jdk 1.6 and it works.
>
> On Tue, Apr 6, 2010 at 12:32 PM, Aaron McCurry <[email protected]> wrote:
> > I am using 1.6, however it is the IBM jvm (not my choice).  If the feature
> > is known to work on the Sun JVM then I will deal with the problem another
> > way.  Thanks.
> >
> > Aaron
> >
> > On Tue, Apr 6, 2010 at 3:12 PM, Zheng Shao <[email protected]> wrote:
> >>
> >> Are you using Java 1.5? Hive now requires Java 1.6
> >>
> >>
> >> > On Tue, Apr 6, 2010 at 7:23 AM, Aaron McCurry <[email protected]> wrote:
> >> > In the past I have used hive 0.3.0 successfully and now with a new
> >> > project coming up I decided to give hive 0.5.0 a run and everything is
> >> > working as expected, except for when I try to get a simple count of the
> >> > table.
> >> >
> >> > The simple table is defined as:
> >> >
> >> > create table log_table (col1 string, col2 string, col3 string, col4 string, col5 string, col6 string)
> >> > row format delimited
> >> > fields terminated by '\t'
> >> > stored as textfile;
> >> >
> >> > And the query I'm running is:
> >> >
> >> > select count(1) from log_table;
> >> >
> >> > From the hive command line I get the following errors:
> >> >
> >> > ...
> >> > In order to set a constant number of reducers:
> >> >    set mapred.reduce.tasks=<number>
> >> > Exception during encoding:java.lang.Exception: failed to write expression: GenericUDAFEvaluator$Mode=Class.new();
> >> > Continue...
> >> > Exception during encoding:java.lang.Exception: failed to write expression: GenericUDAFEvaluator$Mode=Class.new();
> >> > Continue...
> >> > Exception during encoding:java.lang.Exception: failed to write expression: GenericUDAFEvaluator$Mode=Class.new();
> >> > Continue...
> >> > Exception during encoding:java.lang.Exception: failed to write expression: GenericUDAFEvaluator$Mode=Class.new();
> >> > Continue...
> >> > Starting Job = job_201004010912_0015, Tracking URL = .....
> >> >
> >> >
> >> >
> >> > And when looking at the failed hadoop jobs I see the following
> >> > exception:
> >> >
> >> > Caused by: java.lang.ClassCastException:
> >> > org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector
> >> > incompatible with
> >> > org.apache.hadoop.hive.serde2.objectinspector.primitive.LongObjectInspector
> >> >     at org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount$GenericUDAFCountEvaluator.merge(GenericUDAFCount.java:93)
> >> >     at org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator.aggregate(GenericUDAFEvaluator.java:113)
> >> > ...
> >> >
> >> >
> >> > Is this a known issue?  Am I missing something?  Any guidance would be
> >> > appreciated.  Thanks!
> >> >
> >> > Aaron
> >> >
> >>
> >>
> >>
> >> --
> >> Yours,
> >> Zheng
> >> http://www.linkedin.com/in/zshao
> >
> >
>
>
>
> --
> Yours,
> Zheng
> http://www.linkedin.com/in/zshao
>
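[Editor's note for readers landing on this thread with the same trace: the ClassCastException above comes from the merge step of count() receiving a partial result whose ObjectInspector was typed as a writable int, where a long inspector is expected, after the query plan was mis-serialized on the IBM JVM. A minimal stand-alone sketch of that kind of inspector type mismatch; the interfaces below are simplified hypothetical stand-ins, not the real org.apache.hadoop.hive.serde2 classes:]

```java
// Hypothetical, simplified stand-ins for Hive's ObjectInspector hierarchy.
interface ObjectInspector {}

class WritableIntObjectInspector implements ObjectInspector {}

interface LongObjectInspector extends ObjectInspector {
    long get(Object o);
}

public class Main {
    public static void main(String[] args) {
        // After a bad plan round-trip, the merge step is handed an int
        // inspector where it expects a long inspector.
        ObjectInspector fromPlan = new WritableIntObjectInspector();
        try {
            // The downcast compiles, but fails at runtime because
            // WritableIntObjectInspector does not implement LongObjectInspector.
            LongObjectInspector loi = (LongObjectInspector) fromPlan;
            System.out.println("cast ok");
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: incompatible inspector");
        }
    }
}
```

The fix in the thread is the practical one: the serialization path is known to work on the Sun JDK 1.6, so the mismatch never arises there.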
