Yes, everything's readable by everyone. As I said before, the odd thing is that running one of the example jobs like Wordcount works just fine.
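
One thing I still want to double-check on my end (just a hunch, and the class and setup below are only placeholders, not the actual MAJob code): as far as I know, Hadoop only honors -libjars when the driver runs through ToolRunner/GenericOptionsParser, so if MAJob's main() builds the Job directly and calls waitForCompletion(), the jars tool.sh passes may never reach the task classpath. A rough Tool-based sketch would look something like:

    // Rough sketch only -- class name and job setup are placeholders.
    // Going through ToolRunner lets GenericOptionsParser strip -libjars and
    // ship those jars to the tasks before run() ever sees the arguments.
    import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MAJob extends Configured implements Tool {
      @Override
      public int run(String[] args) throws Exception {
        // getConf() carries whatever GenericOptionsParser set up (incl. -libjars)
        Job job = new Job(getConf(), "moving average");
        job.setJarByClass(MAJob.class);
        job.setInputFormatClass(AccumuloInputFormat.class);
        // ... AccumuloInputFormat instance/credentials setup, mapper/reducer
        //     classes, and output configuration go here ...
        return job.waitForCompletion(true) ? 0 : 1;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MAJob(), args));
      }
    }

If MAJob already extends Configured and implements Tool, then this isn't it and the problem is somewhere else.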
--
Chris


On Thu, Feb 14, 2013 at 2:17 PM, Keith Turner <[email protected]> wrote:
> On Thu, Feb 14, 2013 at 1:53 PM, Chris Sigman <[email protected]> wrote:
> > Yep, all of the jars are also available on the datanodes
>
> Also are the jars readable by the user running the M/R job?
>
> > --
> > Chris
> >
> > On Thu, Feb 14, 2013 at 1:51 PM, Billie Rinaldi <[email protected]> wrote:
> >> On Thu, Feb 14, 2013 at 10:41 AM, Chris Sigman <[email protected]> wrote:
> >>> Hi everyone,
> >>>
> >>> I've got a job I'm running that I can't figure out why it's failing.
> >>> I've tried running jobs from the examples, and they work just fine. I'm
> >>> running the job via
> >>>
> >>>   ./bin/tool.sh ~/MovingAverage.jar movingaverage.MAJob inst namenode
> >>>   root pass stockdata movingaverage
> >>>
> >>> which I see is running the following exec call that seems perfect to me:
> >>>
> >>>   exec /usr/lib/hadoop/bin/hadoop jar /MovingAverage.jar
> >>>   movingaverage.MAJob -libjars
> >>>   "/opt/accumulo/lib/libthrift-0.6.1.jar,/opt/accumulo/lib/accumulo-core-1.4.2.jar,/usr/lib/zookeeper//zookeeper-3.3.5-cdh3u5.jar,/opt/accumulo/lib/cloudtrace-1.4.2.jar,/opt/accumulo/lib/commons-collections-3.2.jar,/opt/accumulo/lib/commons-configuration-1.5.jar,/opt/accumulo/lib/commons-io-1.4.jar,/opt/accumulo/lib/commons-jci-core-1.0.jar,/opt/accumulo/lib/commons-jci-fam-1.0.jar,/opt/accumulo/lib/commons-lang-2.4.jar,/opt/accumulo/lib/commons-logging-1.0.4.jar,/opt/accumulo/lib/commons-logging-api-1.0.4.jar"
> >>>   inst namenode root pass tmpdatatable movingaverage
> >>
> >> Does /opt/accumulo/lib/accumulo-core-1.4.2.jar exist on your hadoop nodes,
> >> specifically the one that's running the map?
> >>
> >> Billie
> >>
> >>> but when the job runs, it gets to the map phase and fails:
> >>>
> >>>   13/02/14 13:25:26 INFO mapred.JobClient: Task Id :
> >>>   attempt_201301171408_0293_m_000000_0, Status : FAILED
> >>>   java.lang.RuntimeException: java.lang.ClassNotFoundException:
> >>>   org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
> >>>     at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1004)
> >>>     at org.apache.hadoop.mapreduce.JobContext.getInputFormatClass(JobContext.java:205)
> >>>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:606)
> >>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
> >>>     at org.apache.hadoop.mapred.Child$4.run(Child.java:266)
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>     at javax.security.auth.Subject.doAs(Subject.java:415)
> >>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1278)
> >>>     at org.apache.hadoop.mapred.Child.main(Child.java:260)
> >>>   Caused by: java.lang.ClassNotFoundException:
> >>>   org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
> >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> >>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> >>>     at java.security.AccessController.doPrivileged(Native Method)
> >>>
> >>> I've also tried hacking it to work by adding the accumulo-core jar to
> >>> hadoop's lib dir, but that doesn't seem to work either.
> >>>
> >>> Thanks for any help,
> >>> --
> >>> Chris
