You haven't built libhdfs. You can do that with:

ant compile-c++-libhdfs -Dcompile.c++=true
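Putting the commands from this thread together, a sketch of the full sequence (HADOOP_HOME and the exact build output directory are assumptions here; adjust them to your checkout):

```shell
# Build the native hadoop library and libhdfs from the hadoop-0.20.2
# source tree (assumes ant and a JDK are installed; HADOOP_HOME pointing
# at the source root is an assumption for this sketch).
cd $HADOOP_HOME
ant -Dcompile.native=true package
ant compile-c++-libhdfs -Dcompile.c++=true

# If unsure where the build placed the library, locate it first:
# find $HADOOP_HOME/build -name 'libhdfs.so*'

# Point the dynamic loader at the directory containing libhdfs.so.0
# before launching fuse_dfs (platform directory assumed to be
# Linux-amd64-64 on a 64-bit Linux box, per the thread).
export LD_LIBRARY_PATH=$HADOOP_HOME/build/c++/Linux-amd64-64/lib:$LD_LIBRARY_PATH
```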

On Sun, Jul 31, 2011 at 10:26 PM, Aastha Mehta <aasth...@gmail.com> wrote:
> The command works correctly. But I still get the error for running the
> fuse_dfs_wrapper.sh script:
>
> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
> shared object file: No such file or directory
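As a generic diagnostic (not from the thread itself), ldd shows which shared libraries the binary fails to resolve:

```shell
# List the shared libraries fuse_dfs links against; unresolved ones
# show up as "not found". Run from the directory holding the binary.
ldd ./fuse_dfs | grep 'not found'
```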
>
> Aastha.
>
> On 1 August 2011 10:03, Arun C Murthy <a...@hortonworks.com> wrote:
>
>> Run the following command:
>>
>> $ ant -Dcompile.native=true package
>>
>> Arun
>>
>> On Jul 31, 2011, at 9:20 PM, Aastha Mehta wrote:
>>
>> > Hi Arun,
>> >
>> > Thanks for the prompt reply. I am not sure I understood you correctly.
>> > Compile/binary/tar of what? The native files? The
>> > lib/native/Linux-amd64-64/ contains the following files:
>> > libhadoop.a
>> > libhadoop.la
>> > libhadoop.so
>> > libhadoop.so.1
>> > libhadoop.so.1.0.0
>> >
>> > This directory is present in the package itself. So, should I make a tar
>> > of it and then provide it? I tried the following, but it failed:
>> > ant -Dcompile.native=true
>> > $HADOOP_HOME/lib/native/Linux-amd64-64/libhadoop.so
>> >
>> > The error I got is - "Target  lib/native/Linux-amd64-64/libhadoop.so does
>> > not exist in the project Hadoop".
>> >
>> > Thanks,
>> > Aastha.
>> >
>> > On 1 August 2011 09:44, Arun Murthy <a...@hortonworks.com> wrote:
>> >
>> >> <target> could be compile or binary or tar.
>> >>
>> >> Arun
>> >>
>> >> Sent from my iPhone
>> >>
>> >> On Jul 31, 2011, at 9:05 PM, Aastha Mehta <aasth...@gmail.com> wrote:
>> >>
>> >>> Hello,
>> >>>
>> >>> I am trying to run fuse_dfs_wrapper.sh from
>> >>> hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
>> >>> following error:
>> >>> ./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot
>> >>> open shared object file: No such file or directory
>> >>>
>> >>> I searched on the net and found a response to a similar query here -
>> >>> https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1
>> >>>
>> >>> My hadoop package contains the native files in
>> >>> hadoop-0.20.2/lib/native/Linux-amd64-64/
>> >>>
>> >>> I followed this link -
>> >>> http://hadoop.apache.org/common/docs/current/native_libraries.html to
>> >>> understand the steps to build the hadoop native libraries.
>> >>>
>> >>> I have a small query regarding the building step. On the above link, it
>> >>> is mentioned -
>> >>>
>> >>> "Once you installed the prerequisite packages use the standard hadoop
>> >>> build.xml file and pass along the compile.native flag (set to true) to
>> >>> build the native hadoop library:
>> >>>
>> >>> $ ant -Dcompile.native=true <target>
>> >>>
>> >>> You should see the newly-built library in:
>> >>>
>> >>> $ build/native/<platform>/lib
>> >>>
>> >>> where <platform> is a combination of the system-properties:
>> >>> ${os.name}-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."
>> >>>
>> >>>
>> >>> Could someone please tell me what exactly <target> is in the first step?
>> >>>
>> >>>
>> >>> Thanks and regards,
>> >>>
>> >>> Aastha.
>> >>>
>> >>> --
>> >>> Aastha Mehta
>> >>> B.E. (Hons.) Computer Science
>> >>> BITS Pilani
>> >>> E-mail: aasth...@gmail.com
>> >>
>> >
>> >
>> >
>>
>>
>
>
>
