Hello Hadoopers!

I recently started using Hadoop, and I have a question that has puzzled me for a couple of days now.

I have already browsed the mailing list and found some relevant posts, especially http://mail-archives.apache.org/mod_mbox/lucene-hadoop-user/200708.mbox/[EMAIL PROTECTED], but the solution eludes me.

My Map/Reduce job relies on external jars, so I modified my ant script to include them in the lib/ directory of my job jar. So far, so good: the job runs without any issues as long as I run it on my local machine alone.
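
For reference, this is roughly the ant target I use to do the bundling (destfile and directory names are placeholders, not my actual build file):

    <jar destfile="build/myjob.jar">
      <fileset dir="build/classes"/>
      <!-- bundle the dependency jars under lib/ inside the job jar -->
      <zipfileset dir="lib" prefix="lib/" includes="*.jar"/>
    </jar>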

However, adding a second machine to the mini-cluster presents the following problem: a NullPointerException is thrown as soon as I call any method on a class imported from the external jars. Note that this happens only on the second machine; the map tasks on my main machine, the one I submit the job from, proceed without any warnings.

The actual log output from Hadoop is:

    java.lang.NullPointerException
            at xxx.xxx.xxx (Unknown Source)

My job jar contains all the necessary jars in its lib/ directory. Do I need to place them somewhere else on the slave machines for my submitted job to be able to use them?
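
In case it is useful, here is the kind of sanity check I am thinking of adding to my mapper's configure() to see whether the bundled classes are visible on the task classpath at all (the class name is just a placeholder for one of the classes shipped in lib/):

    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;

    public class ClasspathCheckMapper extends MapReduceBase {
      public void configure(JobConf job) {
        try {
          // placeholder for a class that lives in one of the external jars
          Class.forName("com.example.ExternalClass");
        } catch (ClassNotFoundException e) {
          // fail loudly here instead of hitting an opaque NPE later on
          throw new RuntimeException(
              "external jar missing from task classpath", e);
        }
      }
    }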

Any pointers would be much appreciated.
