Ok, looks like I was finally able to set up FUSE-HDFS on a SLES10 box (using 
the Hadoop-0.20.1 distribution).
As it took me a while to figure this out, I thought I'd share the steps with 
the Universe.

1. My SLES10 system runs a 2.6.16.60 kernel; the OS is 64-bit.
2. Download and build FUSE. I used fuse-2.8.0.tar.gz; the usual ./configure && 
make && make install worked.
3. modprobe fuse
4. export JAVA_HOME=<wherever your 64-bit Java 1.6 resides>
5. export HADOOP_HOME=<wherever your 0.20.1 hadoop lives>
6. vi  $HADOOP_HOME/src/contrib/fuse-dfs/src/fuse_dfs_wrapper.sh   - update the 
variables (HADOOP_HOME, JAVA_HOME, ...)
7. cd $HADOOP_HOME
8. export ANT_OPTS="-Dhttp.proxyHost=<your HTTP proxy> -Dhttp.proxyPort=<port>"
   This lets ant download its dependencies during the build if you are behind a firewall.
9. Make sure your default autoconf is v2.61, and your default ant is 1.7.1  - 
I've had problems with older versions
10. ant compile -Dcompile.c++=true -Dlibhdfs=true
11. ln -s $HADOOP_HOME/build/c++/Linux-amd64-64/lib $HADOOP_HOME/build/libhdfs
12. export LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:$HADOOP_HOME/build/libhdfs:$JAVA_HOME/jre/lib/amd64/server
13. ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

If everything is ok up to this point, you should be ready to go. Follow the 
http://wiki.apache.org/hadoop/MountableHDFS procedure, starting after the INSTALLING section.
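With the build done, the mount step from the wiki boils down to something like the sketch below. The namenode URI and mount point here are hypothetical placeholders; substitute your own cluster's values:

```shell
#!/bin/sh
# Sketch of the mount step; all paths/hosts below are example values.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop-0.20.1}"
NAMENODE_URI="dfs://namenode.example.com:9000"   # your namenode host:port
MOUNTPOINT="/tmp/hdfs"                           # wherever you want HDFS mounted

mkdir -p "$MOUNTPOINT"

WRAPPER="$HADOOP_HOME/src/contrib/fuse-dfs/src/fuse_dfs_wrapper.sh"
if [ -x "$WRAPPER" ]; then
    # The wrapper passes its arguments through to the fuse_dfs binary;
    # -d keeps it in the foreground with debug output while you test.
    "$WRAPPER" "$NAMENODE_URI" "$MOUNTPOINT" -d
else
    echo "wrapper not found: $WRAPPER (check HADOOP_HOME)"
fi
```

Once the mount works, drop the -d flag to run fuse_dfs in the background.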

-----Original Message-----
From: Ryan Smith [mailto:ryan.justin.sm...@gmail.com] 
Sent: Thursday, September 10, 2009 4:04 PM
To: common-user@hadoop.apache.org
Subject: Re: Building libhdfs.so in Hadoop 0.20.1

Maybe someone can correct me if I'm wrong, but this is what I did to get
libhdfs to build on 0.20.0:

NOTE: on debian, you need to apply a patch:
https://issues.apache.org/jira/browse/HADOOP-5611

Compile libhdfs:  ant compile-contrib -Dlibhdfs=1
Then, to install libhdfs into the local hadoop lib:   ant package -Dlibhdfs=1
If the ant package -Dlibhdfs=1 command fails because of Forrest, you can
exclude the docs from the packaging.
cd $HADOOP_HOME/
vi build.xml
Then change the package target: remove the "javadoc", "docs", and
"cn-docs" targets from its depends list and re-run the package command.
Finally, add a soft link for backwards compatibility so hdfs-fuse/fuse-dfs
can find the libhdfs.so file:
ln -s ./c++/{OS}-{ARCH}/lib build/libhdfs



On Thu, Sep 10, 2009 at 8:38 AM, Touretsky, Gregory <
gregory.touret...@intel.com> wrote:

>  Hi,
>
>
>
>    I have a problem building libhdfs.so in Hadoop 0.20.1
>
> From what I could see, the build process has changed significantly in
> 0.20.0 (as mentioned in http://issues.apache.org/jira/browse/HADOOP-3344),
> and "ant compile-libhdfs -Dlibhdfs=1" can't be used anymore.
>
> I'm trying to use standard autotools to build the libhdfs - but it always
> fails:
>
> >cd /tmp/hadoop-0.20.1/src/c++/libhdfs
>
> >./configure >& ~/tmp/configure.out   (see attached)
>
> > make
>
> if /bin/sh ./libtool --mode=compile --tag=CC
> /usr/intel/pkgs/gcc/4.4.0/bin/gcc -DPACKAGE_NAME=\"libhdfs\"
> -DPACKAGE_TARNAME=\"libhdfs\" -DPACKAGE_VERSION=\"0.1.0\"
> -DPACKAGE_STRING=\"libhdfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"
> omal...@apache.org\" -DPACKAGE=\"libhdfs\" -DVERSION=\"0.1.0\"
> -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1
> -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1
> -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\"
> -Dsize_t=unsigned\ int -DHAVE_FCNTL_H=1 -Dconst= -Dvolatile= -I. -I.     -g
> -O2 -DOS_LINUX -DDSO_DLFCN -DCPU=\"amd64\" -m 
> -I/usr/intel/pkgs/java/1.6.0.10/include
> -I/usr/intel/pkgs/java/1.6.0.10/include/linux -Wall
>  -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF ".deps/hdfs.Tpo" -c -o
> hdfs.lo hdfs.c; \
>
> then mv -f ".deps/hdfs.Tpo" ".deps/hdfs.Plo"; else rm -f ".deps/hdfs.Tpo";
> exit 1; fi
>
> libtool: compile:  /usr/intel/pkgs/gcc/4.4.0/bin/gcc
> -DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\"
> -DPACKAGE_VERSION=\"0.1.0\" "-DPACKAGE_STRING=\"libhdfs 0.1.0\""
> -DPACKAGE_BUGREPORT=\"omal...@apache.org\" -DPACKAGE=\"libhdfs\"
> -DVERSION=\"0.1.0\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1
> -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1
> -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1
> -DHAVE_DLFCN_H=1 -DLT_OBJDIR=\".libs/\" "-Dsize_t=unsigned int"
> -DHAVE_FCNTL_H=1 -Dconst= -Dvolatile= -I. -I. -g -O2 -DOS_LINUX -DDSO_DLFCN
> -DCPU=\"amd64\" -m -I/usr/intel/pkgs/java/1.6.0.10/include
> -I/usr/intel/pkgs/java/1.6.0.10/include/linux -Wall
>  -Wstrict-prototypes -MT hdfs.lo -MD -MP -MF .deps/hdfs.Tpo -c hdfs.c
> -fPIC -DPIC -o .libs/hdfs.o
>
> cc1: error: unrecognized command line option "-m"
>
> make: *** [hdfs.lo] Error 1
>
>
>
> Any idea how to do it now?
> Also, it's probably time to update the
> http://wiki.apache.org/hadoop/MountableHDFS guide.
>
>
>
> Thanks in advance,
>
>    Gregory Touretsky
>
> ---------------------------------------------------------------------
> Intel Israel (74) Limited
>
> This e-mail and any attachments may contain confidential material for
> the sole use of the intended recipient(s). Any review or distribution
> by others is strictly prohibited. If you are not the intended
> recipient, please contact the sender and delete all copies.
>
>