Assuming you're using hadoop-1.0, then set
   export HADOOP_USER_CLASSPATH_FIRST=true
on the submit/client side to use your HADOOP_CLASSPATH before the framework's, and
    -Dmapreduce.user.classpath.first=true
on the task side to use your -libjars before the framework's.
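
For example, here's a rough sketch of a full submit command (the jar names,
paths, and driver class are hypothetical, and the -D/-libjars options only
take effect if your driver goes through ToolRunner/GenericOptionsParser):

    export HADOOP_CLASSPATH=/path/to/jackson-core-asl-1.8.8.jar:/path/to/jackson-mapper-asl-1.8.8.jar
    export HADOOP_USER_CLASSPATH_FIRST=true
    hadoop jar myjob.jar com.example.MyDriver \
        -Dmapreduce.user.classpath.first=true \
        -libjars /path/to/jackson-core-asl-1.8.8.jar,/path/to/jackson-mapper-asl-1.8.8.jar \
        input_dir output_dir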

Koji



On 1/23/12 6:39 AM, "John Armstrong" <john.armstr...@ccri.com> wrote:

> On Fri, 13 Jan 2012 13:59:12 -0800 (PST), vvkbtnkr <vvkbt...@yahoo.com>
> wrote:
>> I am running a hadoop jar and keep getting this error -
>> java.lang.NoSuchMethodError:
>> org.codehaus.jackson.JsonParser.getValueAsLong()
> 
> Nobody seems to have answered this while I was on vacation, so...
> 
> Okay, here's what I know, having run into a similar problem before.
> Hadoop, at some point, decided to use Jackson to provide JSON serialization
> as an alternative format for its Configuration objects.  Of course, this
> means that Hadoop now depends on Jackson, and in particular on some specific
> version of Jackson.  In my own version (CDH3b3 = Hadoop 0.20.2+737), that
> version is 1.5.2.  The $HADOOP_HOME/lib directory on each of my tasktracker
> nodes contains:
> 
> jackson-core-asl-1.5.2.jar
> jackson-mapper-asl-1.5.2.jar
> 
> So?  So here's how the Hadoop classloader works: first it loads up
> anything in $HADOOP_HOME/lib, then it loads anything on your distributed
> classpath, including the specified job JAR (we're using an überJAR
> containing all of our dependencies).  But, like all classloaders...
> 
> THE FIRST VERSION OF A LOADED CLASS TAKES PRECEDENCE
> 
> This is Very Important for Java to work right, and it should be the first
> thing any Java hacker unfortunate enough to have to understand classloaders
> learns.
> 
> In your case, this means that you're trying to use the 1.8.x version of
> org.codehaus.jackson.JsonParser, but the 1.5.2 version has already been
> loaded, and that takes precedence.  Since JsonParser doesn't have a method
> called getValueAsLong() in Jackson 1.5.2, you get the NoSuchMethodError.
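> 
> If you want to confirm which version actually got loaded, one quick check is
> to print the class's code source from your driver or a task's setup method;
> this is plain Java, nothing Hadoop-specific:
> 
>     // Prints the jar that JsonParser was actually loaded from,
>     // e.g. .../hadoop/lib/jackson-core-asl-1.5.2.jar on a tasktracker.
>     System.out.println(org.codehaus.jackson.JsonParser.class
>         .getProtectionDomain().getCodeSource().getLocation());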
> 
> So, what do you do?  You've got a couple of options.  If you control your
> own deployment environment, that's great; you can try upgrading Hadoop's
> version of Jackson.  Jackson is pretty good about backwards compatibility, so you
> probably won't have any problems.  On the other hand, if you don't control
> your deployment environment (or you don't want to risk fiddling with it,
> which I Totally Get), you can try to rewrite your code to use Jackson
> version 1.5.2; that's what we did, and it's sort of awkward in some of the
> ways it does things, but I think for your purposes it probably won't be too
> difficult.  I think JsonParser.getLongValue() is the right method to
> consider.
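> 
> As a rough sketch of what that 1.5.2-friendly code looks like (the JSON
> content and class name here are made up for illustration):
> 
>     import org.codehaus.jackson.JsonFactory;
>     import org.codehaus.jackson.JsonParser;
>     import org.codehaus.jackson.JsonToken;
> 
>     public class Jackson152Compat {
>         public static void main(String[] args) throws Exception {
>             JsonParser p = new JsonFactory().createJsonParser("{\"count\":42}");
>             p.nextToken();                 // START_OBJECT
>             p.nextToken();                 // FIELD_NAME "count"
>             p.nextToken();                 // VALUE_NUMBER_INT
> 
>             // Jackson 1.8.x style (not available in 1.5.2):
>             // long n = p.getValueAsLong();
> 
>             // Jackson 1.5.2-compatible style: check the token type yourself,
>             // then read the numeric value directly.
>             long n = 0L;
>             if (p.getCurrentToken() == JsonToken.VALUE_NUMBER_INT) {
>                 n = p.getLongValue();
>             }
>             System.out.println(n);         // prints 42
>         }
>     }
> 
> The main awkwardness is that getValueAsLong() coerces strings and other
> scalar tokens for you, while getLongValue() only applies when the current
> token is already numeric, so you end up doing those checks yourself.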
> 
> HTH
