Hi,

You can put your dependency jars in a lib/ folder inside your main jar.
Hadoop will automatically add them to the task classpath and distribute
them to the cluster.

You can also explore using the DistributedCache or the -libjars option.
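A rough sketch of both options (jar names, paths, and the main class are just placeholders from your mail; -libjars only works if your main class parses generic options, e.g. via ToolRunner):

```shell
# Option 1: bundle the dependency inside the job jar.
# Hadoop puts jars found under lib/ inside the job jar on the task classpath.
mkdir -p build/lib
cp package.jar build/lib/            # your dependency jar
cp -r classes/* build/               # your compiled classes
jar cf myExecutable.jar -C build .   # job jar now contains lib/package.jar
hadoop jar myExecutable.jar myClass

# Option 2: keep the dependency outside and ship it per job with -libjars
# (requires the main class to use ToolRunner / GenericOptionsParser).
hadoop jar myExecutable.jar myClass -libjars package.jar
```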
Thanks and Regards,
Sonal
www.meghsoft.com


On Mon, Apr 19, 2010 at 7:54 PM, Gang Luo <[email protected]> wrote:

> Hi all,
> this is kind of a Java problem. I am using a package. In an example
> program, I put the package on the "-classpath" when compiling, and packed
> the result into a jar. When I execute my jar file, I also need to put the
> original package on the classpath, like this: "java -classpath
> package.jar:myExecutable.jar myClass". Otherwise, it throws a
> ClassNotFoundException. However, when running a program in Hadoop, I
> cannot specify more than one jar file (bin/hadoop jar myExecutable.jar
> myClass). How do I include that package.jar? I tried "export
> CLASSPATH=...", but it doesn't help.
>
> Thanks,
> -Gang
