Hey Chang,

The code examples are in the src/examples/pipes/ subdirectory inside Hadoop. There is a README file there that walks you through the examples.
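For reference, the word-count example in that directory looks roughly like the sketch below (a minimal summary of the shipped wordcount-simple example; exact headers and helper names may differ between releases):

#include "hadoop/Pipes.hh"
#include "hadoop/TemplateFactory.hh"
#include "hadoop/StringUtils.hh"

#include <string>
#include <vector>

// Mapper: split each input line into words and emit (word, "1").
class WordCountMap : public HadoopPipes::Mapper {
public:
  WordCountMap(HadoopPipes::TaskContext& context) {}
  void map(HadoopPipes::MapContext& context) {
    std::vector<std::string> words =
        HadoopUtils::splitString(context.getInputValue(), " ");
    for (unsigned int i = 0; i < words.size(); ++i) {
      context.emit(words[i], "1");
    }
  }
};

// Reducer: sum the counts emitted for each word.
class WordCountReduce : public HadoopPipes::Reducer {
public:
  WordCountReduce(HadoopPipes::TaskContext& context) {}
  void reduce(HadoopPipes::ReduceContext& context) {
    int sum = 0;
    while (context.nextValue()) {
      sum += HadoopUtils::toInt(context.getInputValue());
    }
    context.emit(context.getInputKey(), HadoopUtils::toString(sum));
  }
};

int main(int argc, char* argv[]) {
  // Hands control to the Pipes runtime, which talks to the Java side
  // and drives the map/reduce callbacks above.
  return HadoopPipes::runTask(
      HadoopPipes::TemplateFactory<WordCountMap, WordCountReduce>());
}

The README covers building the binary against the pipes/utils libraries and running it with bin/hadoop pipes.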
Thanks,
Sandeep

On Sun, 2008-02-24 at 12:14 -0500, Chang Hu wrote:
> Folks, thank you very much for your response. My comments below:
>
> Sandeep -
>
> > You can easily use hadoop pipes for this particular problem. Have you
> > tried them?
>
> Unfortunately, no. I googled and found no example of using hadoop pipes.
> The javadoc is helpful but didn't have an example, either. Is there
> anywhere I can find an example?
>
> Jason -
>
> Thanks. I am trying this approach.
>
> Alejandro -
>
> Thanks. I am using 0.15.0 right now, so I'll have to do something like
> Jason mentioned.
>
> - Chang
>
> On Sun, Feb 24, 2008 at 6:10 AM, Alejandro Abdelnur <[EMAIL PROTECTED]> wrote:
>
> > If you add the native libraries to the distributed cache on the job
> > working directory, they will be picked up for the job. No need to
> > modify anything in hadoop.
> >
> > This functionality was introduced in HADOOP-1660
> > (https://issues.apache.org/jira/browse/HADOOP-1660).
> >
> > You need to add the lib to the distributed cache for the job and then
> > create a symlink to it.
> >
> > A
> >
> > On Sun, Feb 24, 2008 at 1:24 PM, Jason Venner <[EMAIL PROTECTED]> wrote:
> > > For our JNI tasks, we just install our libraries in the same places on
> > > each machine, and set the LD_LIBRARY_PATH to include these directories
> > > in hadoop-env.sh.
> > >
> > > This does require a restart of the mapred portion of the cluster if you
> > > need to change the LD_LIBRARY_PATH.
> > >
> > > Chang Hu wrote:
> > > > I believe both static and dynamic libraries can be made via gcc. Is
> > > > there a difference, and where should I put them in Hadoop?
> > > >
> > > > Thanks,
> > > >
> > > > - Chang
> > > >
> > > > On Sat, Feb 23, 2008 at 10:28 AM, 11 Nov. <[EMAIL PROTECTED]> wrote:
> > > >
> > > > > Are your external C++ libraries statically linked?
> > > > >
> > > > > 2008/2/22, Chang Hu <[EMAIL PROTECTED]>:
> > > > >
> > > > > > Hi,
> > > > > >
> > > > > > I have an image processing library in C++ and want to run it as a
> > > > > > MapReduce job via JNI. While I have some idea about how to include
> > > > > > an external JAR into MapReduce, I am not sure how that works with
> > > > > > external C++ libraries.
> > > > > >
> > > > > > It could be easier to use Hadoop Streaming, but I am not sure how
> > > > > > to do that, either.
> > > > > >
> > > > > > Suggestions?
> > > > > >
> > > > > > Thanks,
> > > > > >
> > > > > > - Chang
> > > > > >
> > > > > > --
> > > > > > ---------------
> > > > > > Überstehen ist alles.
> > > > > >
> > > > > > Chang Hu
> > > > > > Ph.D. student
> > > > > > Computer Science Department
> > > > > > University of Maryland
> > >
> > > --
> > > Jason Venner
> > > Attributor - Publish with Confidence <http://www.attributor.com/>
> > > Attributor is hiring Hadoop Wranglers, contact if interested
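P.S. If you stay with the JNI route from the original question, the native side is just an ordinary shared library with JNI glue around the C++ code. A rough, hypothetical sketch (the package/class org.example.ImageProcessor, the process method, and the imagelib call are placeholders, not anything from Hadoop):

#include <jni.h>
#include <vector>

#include "image_lib.h"  // hypothetical header for the external C++ library

// Glue for a hypothetical Java class org.example.ImageProcessor declaring:
//   public native byte[] process(byte[] input);
extern "C"
JNIEXPORT jbyteArray JNICALL
Java_org_example_ImageProcessor_process(JNIEnv* env, jobject, jbyteArray input) {
  // Copy the input bytes out of the JVM heap.
  jsize len = env->GetArrayLength(input);
  std::vector<unsigned char> buf(len);
  if (len > 0) {
    env->GetByteArrayRegion(input, 0, len, reinterpret_cast<jbyte*>(&buf[0]));
  }

  // Call into the external image library (hypothetical function).
  std::vector<unsigned char> out = imagelib::process(buf);

  // Copy the result back into a new Java byte[].
  jbyteArray result = env->NewByteArray(out.size());
  if (!out.empty()) {
    env->SetByteArrayRegion(result, 0, out.size(),
                            reinterpret_cast<const jbyte*>(&out[0]));
  }
  return result;
}

Compile this together with the image library into a .so, ship it with one of the approaches above (distributed cache plus a symlink, or a fixed install path on LD_LIBRARY_PATH), and load it in the mapper with System.loadLibrary before calling the native method.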