I printed out my classpath in the configure function of the mapper and reducer, and it looks like the jars in /usr/lib/hadoop/lib are still appearing first. So I must not be setting the option that puts my classpath first correctly. (A sketch of this configure() check follows the thread below.)
Any ideas what I might be doing wrong?

J

On Sun, Jul 15, 2012 at 11:34 PM, Jeremy Lewi <[email protected]> wrote:
> Thanks Alan.
>
> I'm still getting the same error as before. Here's how I'm running the
> job:
>
>   hadoop jar ./target/contrail-1.0-SNAPSHOT-job.jar \
>       contrail.avro.QuickMergeAvro \
>       -D mapreduce.task.classpath.first=true \
>       -libjars=/users/jlewi/svn_avro_1.6.1/lang/java/avro/target/avro-1.6.1.jar,/users/jlewi/svn_avro_1.6.1/lang/java/mapred/target/avro-mapred-1.6.1.jar \
>       --inputpath=/users/jlewi/staph/assembly/BuildGraph \
>       --outputpath=/users/jlewi/staph/assembly/QuickMerge \
>       --K=45
>
> I verified via the job tracker that the property
> "mapreduce.task.classpath.first" is getting picked up.
>
> It looks like the problem I'm dealing with is related to
> https://issues.apache.org/jira/browse/AVRO-1103.
>
> Any ideas?
>
> Thanks
> J
>
> On Sun, Jul 15, 2012 at 2:00 AM, Alan Miller <[email protected]> wrote:
>> Hi, just a quick idea: also check ALL directories returned by
>>
>>   hadoop classpath
>>
>> for any Avro-related classes.
>>
>> I was struggling to use avro-1.7.0 with CDH4 but made it work by using
>> the -libjars option and making sure my classes are used BEFORE the
>> standard classes. There's a config property (I don't remember the name)
>> to set for that. Note that setting is for the task's classpath; to
>> control the classpath of your driver class, set HADOOP_CLASSPATH=...
>> and HADOOP_USER_CLASSPATH_FIRST=true.
>>
>> Alan
>>
>> Sent from my iPhone
>>
>> On Jul 15, 2012, at 3:59, "Jeremy Lewi" <[email protected]> wrote:
>>
>>> hi avro-users,
>>>
>>> I'm getting the following exception when using avro 1.6.1 with CDH4:
>>>
>>>   java.lang.NoSuchMethodError:
>>>   org.apache.avro.specific.SpecificData.deepCopy(Lorg/apache/avro/Schema;Ljava/lang/Object;)Ljava/lang/Object;
>>>
>>> The offending code is
>>>
>>>   GraphNodeData copy =
>>>       (GraphNodeData) SpecificData.get().deepCopy(data.getSchema(), data);
>>>
>>> where GraphNodeData is a class generated from my Avro record.
>>>
>>> The code runs just fine on CDH3. I tried rebuilding Avro from source
>>> and installing it in my local repo because of a previous post that said
>>> Avro 1.6.1 in Maven had been built against CDH3. I also deleted all the
>>> Avro jar files I found in /usr/lib/hadoop.
>>>
>>> Any ideas? Thanks!
>>> Jeremy
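A minimal sketch of the classpath check Jeremy describes at the top of the thread, assuming the old org.apache.hadoop.mapred API; the class name is illustrative, not from the original code. Printing java.class.path from configure() shows which jars the task JVM will actually resolve classes from, since entries earlier in the list shadow later ones:

  import java.io.File;
  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.JobConf;
  import org.apache.hadoop.mapred.MapReduceBase;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;

  // Hypothetical mapper, shown only to illustrate the diagnostic.
  public class ClasspathDebugMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, Text> {

    @Override
    public void configure(JobConf job) {
      // Print each classpath entry on its own line. If the
      // /usr/lib/hadoop/lib jars print before the -libjars ones,
      // the cluster's bundled Avro classes win.
      for (String entry :
          System.getProperty("java.class.path").split(File.pathSeparator)) {
        System.err.println("classpath entry: " + entry);
      }
    }

    @Override
    public void map(LongWritable key, Text value,
        OutputCollector<Text, Text> output, Reporter reporter)
        throws IOException {
      // No-op: this mapper exists only for the configure() diagnostic.
    }
  }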

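The NoSuchMethodError in the original post is the classic symptom of an older Avro release, one that predates deepCopy(Schema, Object), shadowing the 1.6.1 jars passed via -libjars. A quick way to confirm which jar is winning, offered here as an assumption-laden sketch (the class name is illustrative), is to ask the JVM where it loaded SpecificData from:

  import org.apache.avro.specific.SpecificData;

  // Prints the jar that SpecificData was actually loaded from. If this
  // shows an older avro jar under /usr/lib/hadoop/lib rather than the
  // avro-1.6.1.jar supplied via -libjars, the classpath ordering is
  // still wrong and deepCopy(Schema, Object) will be missing at runtime.
  public class AvroVersionCheck {
    public static void main(String[] args) {
      System.out.println(SpecificData.class
          .getProtectionDomain().getCodeSource().getLocation());
    }
  }

The same one-liner can be dropped into a mapper's configure() to check the task JVM rather than the driver JVM, since the two classpaths are controlled by different settings, as Alan's reply notes.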