Hello,

I am trying to run fuse_dfs_wrapper.sh from
hadoop-0.20.2/src/contrib/fuse_dfs/src on a 64-bit machine. I get the
following error:
./fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open
shared object file: No such file or directory

I searched online and found a response to a similar query here -
https://groups.google.com/a/cloudera.org/group/cdh-user/browse_thread/thread/3db7efc10cff8bbc?pli=1

My hadoop package contains the native files in
hadoop-0.20.2/lib/native/Linux-amd64-64/
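For reference, one common workaround for this kind of loader error is to point LD_LIBRARY_PATH at the directory containing libhdfs.so.0 before launching fuse_dfs. This is only a sketch, assuming libhdfs.so.0 actually resides in (or gets copied into) the native directory above; the paths may differ on other installs:

```shell
# Assumption: HADOOP_HOME points at the hadoop-0.20.2 install root,
# and libhdfs.so.0 lives under lib/native/Linux-amd64-64/.
export HADOOP_HOME=/path/to/hadoop-0.20.2
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH"

# Check whether the dynamic loader can now resolve libhdfs:
ldd ./fuse_dfs | grep libhdfs
```

If ldd still reports "not found", the library is elsewhere (or was never built), which is where the native build step below comes in.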

I followed this link -
http://hadoop.apache.org/common/docs/current/native_libraries.html - to
understand the steps for building the Hadoop native libraries.

I have a small query regarding the build step. The above link says:

"Once you installed the prerequisite packages use the standard hadoop
build.xml file and pass along the compile.native flag (set to true) to build
the native hadoop library:

$ ant -Dcompile.native=true <target>

You should see the newly-built library in:

$ build/native/<platform>/lib

where <platform> is a combination of the system-properties:
${os.name}-${os.arch}-${sun.arch.data.model} (for example, Linux-i386-32)."


Could someone please tell me what exactly <target> is in the first step?


Thanks and regards,

Aastha.





-- 
Aastha Mehta
B.E. (Hons.) Computer Science
BITS Pilani
E-mail: aasth...@gmail.com
