On Fri, Aug 27, 2010 at 3:35 PM, Venkatesh <[email protected]> wrote:

>  The mapreduce job code I have (a Java app) depends on other libraries. It
> runs fine when the job is run locally, but when I run it on a true
> distributed setup it fails on the dependencies. Do I have to put all the
> libraries and property files my application depends on in HADOOP_CLASSPATH
> for the mapreduce job to run in a cluster?
>
> thanks
> venkatesh
>
I believe you can pass additional jar libraries via the -libjars switch.
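As a sketch, the invocation would look something like the following. Note that `-libjars` is parsed by `GenericOptionsParser`, so it only works if the job's main class runs through `ToolRunner` (i.e. implements `Tool`). All jar names and paths below are placeholders for illustration:

```shell
# Ship dependent jars to the cluster with the job via -libjars
# (comma-separated list; paths here are hypothetical).
hadoop jar myjob.jar com.example.MyJob \
  -libjars /path/to/dep1.jar,/path/to/dep2.jar \
  /input/path /output/path

# For code the client itself needs before the job is submitted
# (e.g. in the driver), HADOOP_CLASSPATH can be set in addition
# (colon-separated, like a normal Java classpath):
export HADOOP_CLASSPATH=/path/to/dep1.jar:/path/to/dep2.jar
```

The `-libjars` jars are copied into the job's distributed cache, so the tasks on the cluster nodes pick them up automatically; HADOOP_CLASSPATH alone only affects the local JVM that submits the job.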

-- 
Thanks

Ben Campbell
