I found that dropping the required jars into the lib directory works just fine
(all of those jars are prepended to the classpath by the hadoop script).

Are there any flaws in doing it this way?
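For concreteness, a minimal sketch of what that looks like (HADOOP_HOME and the
jar/class names below are made up; adjust to your setup):

  # copy the dependency jar into Hadoop's lib directory
  cp my-deps.jar $HADOOP_HOME/lib/
  # the bin/hadoop script prepends everything in lib/ to the classpath,
  # so the job jar itself only needs your own classes
  $HADOOP_HOME/bin/hadoop jar my-job.jar com.example.MyJob input output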

-----Original Message-----
From: Eyal Oren [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, August 14, 2007 12:45 AM
To: [email protected]
Subject: Re: Specifying external jars in the classpath for Hadoop

On 08/13/07 16:49 -0700, Phantom wrote:
>Hi
>
>I have a map/reduce job that uses external jar files. How do I specify those
>jars in the classpath when submitting the mapred job using ./hadoop jar .... ?
>Suppose my map job relies on API in some external jar, how do I pass this
>jar file as part of my job submission?
As far as I understand (that's what we do anyway), you have to submit one
jar that contains all your dependencies (except for dependencies on the
hadoop libs), including external jars. The easiest is probably to use
maven/ant to build such a "big" jar, with all the dependency jars unpacked
and added into it, and then submit that to hadoop.

  -eyal
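
For concreteness, a rough sketch of the "big" jar approach described above,
done by hand with the jar tool (directory and jar names are made up; an
ant/maven build would automate the same steps):

  # unpack each dependency jar on top of the compiled job classes
  cd build/classes
  jar xf ../../lib/some-dependency.jar
  cd ../..
  # roll classes plus unpacked dependencies into a single job jar
  jar cf my-job.jar -C build/classes .
  # submit that one jar to hadoop
  bin/hadoop jar my-job.jar com.example.MyJob input output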
