There is currently no way to get the Hive version from the command line.

I will leave the other 2 questions to Ashish and Joydeep.

Zheng

On Sun, Jun 14, 2009 at 6:48 PM, Eva Tse <[email protected]> wrote:

>  Great thanks! We are using hive 0.3.0 based on what I know.
>
> BTW, is there a quick way to find out the current hive version on the command
> line, like 'hive -v'? That would be helpful.
>
> Also, when would 0.4.0 be released? And would it work with Hadoop 0.20.x w/o
> a patch?
>
> Thanks for the info!
> Eva.
>
>
>
> On 6/14/09 6:21 PM, "Zheng Shao" <[email protected]> wrote:
>
> Hi Eva,
>
> Which version of Hive are you using? Is it Hive 0.3.0 release or Hive
> trunk?
>
> This bug was recently fixed in
> https://issues.apache.org/jira/browse/HIVE-495. It has not been back-ported
> to Hive 0.3.0.
>
> I would suggest you try the Hive svn trunk (I think it's time to make a Hive
> 0.4.0 release from trunk soon).
>
> Zheng
>
> On Sun, Jun 14, 2009 at 5:34 PM, Eva Tse <[email protected]> wrote:
>
> We get this error when having "cast (tablename.properties['time'] as
> int)/3600.0" in the select clause of a Hive query (with join). Is this a
> known problem/limitation?
> BTW, this only happens when this table is joined with another table in the
> query.
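>
> [A minimal sketch of the query shape being described; the table and column
> names other than "properties" and "time" are hypothetical, not from the
> original report:]
>
>   SELECT cast(t.properties['time'] AS int) / 3600.0
>   FROM tablename t
>   JOIN other_table o ON (t.id = o.id);
>
> [Per the report, the same cast expression works when tablename is queried
> alone; the ClassCastException below appears only with the join.]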
>
> Thanks,
> Eva.
>
> java.lang.ClassCastException: java.util.HashMap cannot be cast to
> org.apache.hadoop.hive.serde2.lazy.LazyMap
>     at
> org.apache.hadoop.hive.serde2.objectinspector.LazyMapObjectInspector.getMapValueElement(LazyMapObjectInspector.java:85)
>     at
> org.apache.hadoop.hive.ql.exec.ExprNodeIndexEvaluator.evaluate(ExprNodeIndexEvaluator.java:86)
>     at
> org.apache.hadoop.hive.ql.exec.ExprNodeFuncEvaluator.evaluate(ExprNodeFuncEvaluator.java:99)
>     at
> org.apache.hadoop.hive.ql.exec.ExprNodeFuncEvaluator.evaluate(ExprNodeFuncEvaluator.java:99)
>     at
> org.apache.hadoop.hive.ql.exec.ExprNodeFuncEvaluator.evaluate(ExprNodeFuncEvaluator.java:99)
>     at
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:67)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:383)
>     at
> org.apache.hadoop.hive.ql.exec.FilterOperator.process(FilterOperator.java:70)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:383)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.createForwardJoinObject(JoinOperator.java:298)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.genObject(JoinOperator.java:541)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.genObject(JoinOperator.java:530)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.genObject(JoinOperator.java:530)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.checkAndGenObject(JoinOperator.java:571)
>     at
> org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:553)
>     at
> org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:145)
>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:430)
>     at org.apache.hadoop.mapred.Child.main(Child.java:155)
>
>
>
>


-- 
Yours,
Zheng
