1. Wrap all your dependent jars inside your artifact; they should sit under a lib folder. This can make your jar file quite big, so if you want to save time uploading big jar files remotely, see option 2.
2. Use -libjars with full paths (or paths relative to your jar package); that should work.
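Option 1 can be sketched roughly like this. Since a jar is just a zip archive, Python's zipfile CLI stands in for the jar tool here; the build/ layout and the stand-in files are assumptions for illustration, not the thread's actual build:

```shell
# Stage a build/ directory with compiled classes plus dependencies under lib/.
mkdir -p build/lib build/demo
touch build/demo/MyJob.class                               # stand-in for your compiled classes
touch build/lib/dependent-1.jar build/lib/dependent-2.jar  # stand-in dependency jars
# Pack it up; with a JDK on the PATH you would run: jar cf main.jar -C build .
(cd build && python3 -m zipfile -c ../main.jar lib demo)
python3 -m zipfile -l main.jar    # the lib/ entries ride along inside main.jar
```

In practice you would pack with the jar tool or your build tool (which also writes META-INF/MANIFEST.MF); Hadoop unpacks the job jar on the task nodes and puts the jars under lib/ on the task classpath.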

On 3/6/2012 9:55 AM, Ioan Eugen Stan wrote:
On 06.03.2012 17:37, Jane Wayne wrote:
currently, i have my main jar and 2 dependent jars. what i do is
1. copy dependent-1.jar to $HADOOP/lib
2. copy dependent-2.jar to $HADOOP/lib

then, when i need to run my job, MyJob inside main.jar, i do the following.

hadoop jar main.jar demo.MyJob -libjars dependent-1.jar,dependent-2.jar
-Dmapred.input.dir=/input/path -Dmapred.output.dir=/output/path

what i want is to NOT copy the dependent jars to $HADOOP/lib and NOT have to
always specify -libjars. is there any way around this multi-step procedure? i
really do not want to clutter $HADOOP/lib or specify a comma-delimited list
of jars for -libjars.

any help is appreciated.


Hello,

Have you tried specifying the full path to each jar in -libjars? My experience with -libjars is that it didn't work as advertised.
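One way to pass full paths to -libjars without maintaining the comma-separated list by hand is to build it from a glob. A minimal sketch; the /tmp/demo-jars directory and its jar names are hypothetical:

```shell
# Collect every jar in a directory into one comma-separated list of full paths.
mkdir -p /tmp/demo-jars
touch /tmp/demo-jars/dependent-1.jar /tmp/demo-jars/dependent-2.jar   # stand-ins
LIBJARS=$(ls /tmp/demo-jars/*.jar | paste -sd, -)
echo "$LIBJARS"
# then run the job as in the original post, but with full paths:
# hadoop jar main.jar demo.MyJob -libjars "$LIBJARS" \
#   -Dmapred.input.dir=/input/path -Dmapred.output.dir=/output/path
```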

Search the list archive for an older post about this issue (-libjars not working). I tried adding a lot of jars; some made it onto the job classpath (two of them), but most didn't.

I got around this by including all the jars in a lib directory inside the main jar.

Cheers,
