Hi,

Did you check your Hadoop cluster? Are you able to run Hadoop jobs successfully? The query 'select * from table1;' reads the table's data files directly (using only the schema from the metastore) and is not executed as a series of map/reduce jobs.
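For example (a sketch, assuming the EXPLAIN output of Hive 0.6; 'table1' is the table from this thread), you can compare the two query plans to see that only the column projection needs a map/reduce stage:

```sql
-- Full-row scan: the plan should contain only a fetch stage, no map/reduce.
EXPLAIN SELECT * FROM table1;

-- Column projection: the plan should contain a map/reduce stage (Stage-1).
EXPLAIN SELECT id FROM table1;
```

If the second EXPLAIN shows a map/reduce stage, that query can only succeed when the underlying Hadoop cluster is able to run jobs.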
A query that projects or filters columns, say 'select id from table1;', will launch one or more map/reduce jobs that are executed on the Hadoop cluster, so kindly check your cluster status before proceeding. Also, kindly tell me which metastore you are using for Hive, i.e. MySQL or Derby.

~~Cheers..

2011/3/23 王世森 <[email protected]>

> Hi,
>
> Thanks for your reply. Here is the result of 'Describe table1;':
>
> hive> Describe table1;
> OK
> id      int
> name    string
> Time taken: 18.104 seconds
>
> Jack
>
> From: sangeetha s [mailto:[email protected]]
> Sent: 23 March 2011 16:40
> To: [email protected]
> Subject: Re: Return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver error
>
> Hi,
>
> Did you check the field names in the table properly? From the log file it looks as if there is no element named id in the table 'table1'. Kindly check if there is any typo. Also, the alias is not required if you are dealing with a single table and simple queries.
>
> Kindly execute 'Describe table1;' to check the fields.
>
> ~~Cheers
>
> 2011/3/22 幻 <[email protected]>
>
> Have you tried querying it like this?
> select a.id from table1 a;
>
> On 22 March 2011 at 16:31, 王世森 <[email protected]> wrote:
>
> Hi,
>
> My Hive version is 0.6.0. I can query data like this: select * from table1;
>
> OK
> 1    wss
> 2    chenliang
> Time taken: 7.366 seconds
>
> But when the SQL is 'select id from table1;', Hive throws an exception:
>
> 2011-03-22 14:12:46,612 Stage-1 map = 0%, reduce = 0%
> 2011-03-22 14:13:13,984 Stage-1 map = 100%, reduce = 100%
> Ended Job = job_201103221411_0001 with errors
>
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver
>
> Here is the Hadoop log:
>
> java.lang.RuntimeException: java.util.NoSuchElementException
>     at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:168)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:235)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.initColumnsNeeded(HiveInputFormat.java:311)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:217)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:338)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>     at org.apache.hadoop.mapred.Child.main(Child.java:170)
> Caused by: java.util.NoSuchElementException
>     at java.util.Vector.lastElement(Vector.java:456)
>     at com.sun.beans.ObjectHandler.lastExp(ObjectHandler.java:134)
>     at com.sun.beans.ObjectHandler.dequeueResult(ObjectHandler.java:138)
>     at java.beans.XMLDecoder.readObject(XMLDecoder.java:201)
>     at org.apache.hadoop.hive.ql.exec.Utilities.deserializeMapRedWork(Utilities.java:409)
>     at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:160)
>     ... 6 more
>
> So, what's the problem?
>
> Thanks very much
>
> Jack
>
> --
> Regards,
> Sangita

--
Regards,
Sangita
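On the metastore question raised above: which metastore Hive uses is configured in conf/hive-site.xml (falling back to the defaults in hive-default.xml). A minimal sketch, assuming the standard JDO property names; the JDBC URLs shown are illustrative, not taken from this thread:

```xml
<!-- Embedded Derby metastore (Hive's out-of-the-box default) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
</property>

<!-- A MySQL metastore would instead look something like: -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive_metastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
```

Note that the embedded Derby metastore supports only one active Hive session at a time, which is another common source of confusing failures.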
