I am currently using Hadoop 0.20.1. When I built Hive with ant I passed -Dhadoop.version="0.20.0". I will try to get you that output, but my connection is rough right now and I need to do a little port forwarding; I will get it to you when I can.
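If it helps, this is roughly what I plan to run once I can reach the box again (a sketch, assuming the usual 'ant package' build described in the Hive README; the 0.20.1 value is just what my cluster reports):

    # check the version the cluster is actually running
    hadoop version

    # rebuild Hive against that exact version, from the Hive source root
    ant clean package -Dhadoop.version="0.20.1"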
On Thu, Jan 21, 2010 at 5:04 PM, Ning Zhang <[email protected]> wrote:
> Which hadoop version are you using (you can see it by running 'hadoop version')? Have you specified hadoop.version when you compile Hive? Do they match?
>
> Also, can you post the log file found through the Tracking URL when you launch the job (in your example http://master:50030/jobdetails.jsp...)? The log files can be found by clicking through the mappers/reducers and looking at the rightmost column of the mapper/reducer stats.
>
>
> On Jan 21, 2010, at 1:37 PM, John Villa wrote:
>
> I did not see any Hive jar files within Hadoop's lib directory. I compiled it using ant and the basic README.
>
> On Thu, Jan 21, 2010 at 4:17 PM, Ning Zhang <[email protected]> wrote:
>
>> It may be because you have two copies of Hive's jar files (e.g., one in Hadoop's lib directory and one in Hive's lib directory) and they are from different releases. If you have compiled Hive trunk yourself, please make sure there are no Hive jar files in Hadoop's lib directory.
>>
>>
>> On Jan 21, 2010, at 12:51 PM, John Villa wrote:
>>
>> Here is what I got from the hive log:
>> 2010-01-21 20:49:13,291 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
>> 2010-01-21 20:49:13,291 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
>> 2010-01-21 20:49:13,294 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
>> 2010-01-21 20:49:13,294 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
>> 2010-01-21 20:49:13,294 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
>> 2010-01-21 20:49:13,294 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
>> 2010-01-21 20:49:18,719 WARN mapred.JobClient (JobClient.java:configureCommandLineOptions(539)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>> 2010-01-21 20:49:55,514 ERROR exec.ExecDriver (SessionState.java:printError(279)) - Ended Job = job_201001211744_0008 with errors
>> 2010-01-21 20:49:55,550 ERROR ql.Driver (SessionState.java:printError(279)) - FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver
>>
>>
>> On Thu, Jan 21, 2010 at 3:37 PM, John Villa <[email protected]> wrote:
>>
>>> Has anyone seen this error?
>>> Any help is appreciated, thanks:
>>>
>>> hive> select * from apachelog where host ="64.62.191.114";
>>> Total MapReduce jobs = 1
>>> Number of reduce tasks is set to 0 since there's no reduce operator
>>> Starting Job = job_201001211744_0002, Tracking URL = http://master:50030/jobdetails.jsp?jobid=job_201001211744_0002
>>> Kill Command = /u01/hadoop/bin/../bin/hadoop job -Dmapred.job.tracker=master:54311 -kill job_201001211744_0002
>>> 2010-01-21 07:24:38,198 map = 0%, reduce =0%
>>> 2010-01-21 07:25:15,029 map = 100%, reduce =100%
>>> Ended Job = job_201001211744_0002 with errors
>>> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.ExecDriver
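A quick way to rule out the duplicate-jar situation described above is a check along these lines; this is only a sketch, and it assumes HADOOP_HOME and HIVE_HOME point at the respective installs (adjust the paths for your layout):

    # any Hive jars sitting on the Hadoop side are suspect
    ls $HADOOP_HOME/lib/ | grep -i hive

    # these are the jars the Hive CLI itself should be picking up
    ls $HIVE_HOME/lib/hive-*.jar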
