I'm still not quite sure what is going on, but I'm reasonably certain
my issues stem from the way this particular cluster is set up (something
weird with hdfs going on).  I found a way to work around the issue by
copying my jars to a shared file system.  That way at least I don't have
to use a fat jar.
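
To be concrete: I copy the jar to the shared location and point -libjars
there, along these lines (/shared/jars is just a stand-in for whatever
mount all the nodes can see):

hadoop jar mrtest.jar my.MRTestJob -libjars /shared/jars/JSON4J.jar in out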

Thanks for your help!

--Thilo

On 06/07/2013 09:15 AM, Thilo Goetz wrote:
On 06/06/2013 08:09 PM, Shahab Yunus wrote:
It is trying to read the JSON4J.jar from local/home/hadoop. Does that
jar exist at this path on the client from which you are invoking it?
Is this jar in the current dir from which you are kicking off the job?

Yes and yes.  In fact, the job goes through because it runs fine on the
master, which is where I'm starting this.  It just fails on the slave.

So I'm starting the job in /local/home/hadoop on the master with this:

hadoop jar mrtest.jar my.MRTestJob -libjars JSON4J.jar in out

It doesn't matter whether I use an absolute or relative path, with or
without the file:// protocol; the result is always the same.



On Thu, Jun 6, 2013 at 1:33 PM, Thilo Goetz <[email protected]> wrote:

    On 06/06/2013 06:58 PM, Shahab Yunus wrote:

        Are you following the guidelines as mentioned here:
        http://grepalex.com/2013/02/25/hadoop-libjars/


    Now I am, so thanks for that :-)

    Still doesn't work though.  Following the hint in that
    post I looked at the job config, which has this:
    tmpjars file:/local/home/hadoop/JSON4J.jar

    I assume that's the correct value.  Any other ideas?
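
    In case it's relevant, the driver does follow the Tool/ToolRunner
    pattern the post describes.  A minimal sketch of that pattern (names
    and job setup are simplified placeholders, not my exact code):

    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MRTestJob extends Configured implements Tool {

        public int run(String[] args) throws Exception {
            // Use the Configuration that GenericOptionsParser has already
            // populated (this is where -libjars ends up as tmpjars).
            Job job = new Job(getConf(), "mrtest");
            job.setJarByClass(MRTestJob.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            // ToolRunner parses and strips the generic options (-libjars,
            // -D, ...) before the remaining args reach run().
            System.exit(ToolRunner.run(new MRTestJob(), args));
        }
    }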

    --Thilo


        Regards,
        Shahab


        On Thu, Jun 6, 2013 at 12:51 PM, Thilo Goetz <[email protected]> wrote:

             Hi all,

             I'm using hadoop 1.0 (yes it's old, but there is nothing I
             can do about that).  I have some M/R programs that work
             perfectly on a single-node setup.  However, they consistently
             fail in the cluster I have available.  I have tracked this
             down to the fact that extra jars I include on the command
             line with -libjars are not available on the slaves.  I get
             FileNotFoundExceptions for those jars.

             For example, I run this:

             hadoop jar mrtest.jar my.MRTestJob -libjars JSON4J.jar in out

             Then I get this (on the slave):

             java.io.FileNotFoundException: File /local/home/hadoop/JSON4J.jar does not exist.
                     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:397)
                     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
                     at org.apache.hadoop.filecache.TaskDistributedCacheManager.setupCache(TaskDistributedCacheManager.java:179)
                     at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1193)
                     at java.security.AccessController.doPrivileged(AccessController.java:284)
                     at javax.security.auth.Subject.doAs(Subject.java:573)
                     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1128)
                     at org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1184)
                     at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1099)
                     at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2382)
                     at java.lang.Thread.run(Thread.java:736)

             Where /local/home/hadoop is where I ran the code on the master.

             As far as I can tell from my internet research, this is
             supposed to work in hadoop 1.0, correct?  It may well be
             that the cluster is somehow misconfigured (I didn't set it
             up myself), so I would appreciate any hints as to what I
             should be looking at in terms of configuration.

             Oh and btw, the fat jar approach, where I put all classes
             required by the M/R code in the main jar, works perfectly.
             However, I would like to avoid that if I possibly can.
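
             (In case it helps to be concrete, by "fat jar" I mean
             repacking the dependency's classes into the job jar itself,
             roughly like this; the exact steps are a simplified sketch:

             mkdir unpacked && (cd unpacked && jar -xf ../JSON4J.jar)
             jar -uf mrtest.jar -C unpacked .
             )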

             Any help appreciated!

             --Thilo





