Thanks Alex, it is about Java libs. I will try both Raghava's approach and
yours. I wanted to be able to run the job from Eclipse, and yours seems
better suited for that.

On Thu, Apr 22, 2010 at 6:28 PM, Alex Kozlov <[email protected]> wrote:

> Hi Farhan,
>
> Are you talking about java libs (jar) or native libs (.so, etc)?
>
> *Jars:*
>
> You can bundle them inside your job jar: just put them in a lib
> subdirectory under the jar's root directory.
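>
> A minimal sketch of the resulting layout (the names are hypothetical):
>
>   myjob.jar
>     com/example/MyJob.class     <- your job classes
>     lib/external-dep.jar        <- the external library
>
> Hadoop unpacks the job jar on each task node and adds the jars under
> lib/ to the task's classpath automatically. Note that
> -Djava.library.path (as in your mapred-site.xml below) only affects
> native libraries, not the Java classpath, which is why you got the
> ClassNotFoundException.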
>
> *Native:*
>
> Put them into $HADOOP_HOME/lib/native/$PLATFORM/ on each node in the
> cluster
>
> where PLATFORM is the string returned by `hadoop
> org.apache.hadoop.util.PlatformName`
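>
> For example, on 64-bit Linux that typically prints something like
> Linux-amd64-64 (the exact string depends on your OS and JVM), so the
> copy would look roughly like this (libmynative.so is a made-up name):
>
>   $ hadoop org.apache.hadoop.util.PlatformName
>   Linux-amd64-64
>   $ cp libmynative.so $HADOOP_HOME/lib/native/Linux-amd64-64/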
>
> There is a way to distribute native libs at runtime, but it's more
> involved.
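>
> (If you're curious, the usual runtime route is the DistributedCache. A
> rough sketch against the 0.20 API; the class name and HDFS path below
> are made up:
>
>   import java.net.URI;
>   import org.apache.hadoop.conf.Configuration;
>   import org.apache.hadoop.filecache.DistributedCache;
>
>   public class NativeLibShipper {
>     // Call this on the job's Configuration before submitting.
>     public static void shipNativeLib(Configuration conf) throws Exception {
>       // Ship the .so from HDFS to every task node; the #fragment names
>       // the symlink created in the task's working directory.
>       DistributedCache.addCacheFile(
>           new URI("hdfs:///libs/libmynative.so#libmynative.so"), conf);
>       DistributedCache.createSymlink(conf);
>       // The task's working directory is on java.library.path, so
>       // System.loadLibrary("mynative") should then find the symlink.
>     }
>   }
>
> The static $HADOOP_HOME/lib/native route above is simpler though.)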
>
> Alex K
>
> On Thu, Apr 22, 2010 at 4:04 PM, Raghava Mutharaju
> <[email protected]> wrote:
>
> > Hello Farhan,
> >
> >        I use an external library and run the MR job from the command
> > line, so I specify it with -libjars as follows:
> >
> > hadoop jar (my jar) (my class) -libjars (external jar) (args for my class)
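> >
> > Note that -libjars is only honored when the main class parses its
> > arguments through GenericOptionsParser, which ToolRunner does for you.
> > A minimal sketch (MyJob stands in for your driver class):
> >
> >   import org.apache.hadoop.conf.Configuration;
> >   import org.apache.hadoop.conf.Configured;
> >   import org.apache.hadoop.util.Tool;
> >   import org.apache.hadoop.util.ToolRunner;
> >
> >   public class MyJob extends Configured implements Tool {
> >     public int run(String[] args) throws Exception {
> >       // getConf() here already reflects -libjars and the other
> >       // generic options; set up and submit the job with it.
> >       return 0;
> >     }
> >
> >     public static void main(String[] args) throws Exception {
> >       System.exit(ToolRunner.run(new Configuration(), new MyJob(), args));
> >     }
> >   }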
> >
> > Raghava.
> >
> > On Thu, Apr 22, 2010 at 6:21 PM, Farhan Husain
> > <[email protected]> wrote:
> >
> > > Hello guys,
> > >
> > > Can you please tell me how I can use external libraries that my jobs
> > > link to in a MapReduce job? I added the following lines to
> > > mapred-site.xml on all my nodes and put the external library jars in
> > > the specified directory, but I am getting a ClassNotFoundException:
> > >
> > > <property>
> > >  <name>mapred.child.java.opts</name>
> > >  <value>-Xmx512m -Djava.library.path=/hadoop/Hadoop/userlibs</value>
> > > </property>
> > >
> > > Am I doing anything wrong? Is there any other way to solve my problem?
> > >
> > > Thanks,
> > > Farhan
> > >
> >
>