I have recently rebuilt a server with CentOS 6.0, and it seems that
something has caused hadoop-fuse to get confused: it is no longer able
to find libjvm.so. The error I get is:
find: `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre//jre/lib': No such
file or directory
/usr/lib/hadoop-0.20/bin/fuse_dfs: error while loading shared
libraries: libjvm.so: cannot open shared object file: No such file or
directory
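The doubled path is easy to reproduce in isolation: once JAVA_HOME itself ends in /jre/, appending another /jre/lib yields a directory that does not exist, while searching from JAVA_HOME directly still works. A minimal sketch against a throwaway directory tree (all paths below are made up for the demo):

```shell
#!/bin/sh
# Build a fake Sun JVM layout under a temp dir.
root=$(mktemp -d)
jvm="$root/usr/lib/jvm/java-1.6.0-sun-1.6.0.14"
mkdir -p "$jvm/jre/lib/amd64/server"
touch "$jvm/jre/lib/amd64/server/libjvm.so"

# hadoop-config.sh has picked the .../jre/ candidate:
JAVA_HOME="$jvm/jre/"

# hadoop-fuse-dfs then appends /jre/lib, producing .../jre//jre/lib:
status=0
find "${JAVA_HOME}/jre/lib" -name libjvm.so 2>/dev/null || status=$?

# Searching from JAVA_HOME itself still finds the library:
found=$(find "${JAVA_HOME}" -name libjvm.so)

rm -rf "$root"
echo "find exit status: $status"
echo "found: $found"
```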
A quick look around suggests /usr/lib/hadoop-0.20/bin/hadoop-config.sh
is setting JAVA_HOME to `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre/`,
and /usr/bin/hadoop-fuse-dfs has the following, which adds an extra
/jre/ to the path:
for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
Is there a need to specify the subfolder? I think it would make
things simpler to just change the above to:
for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
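As a sanity check of the simpler form, here is the same loop run against a mocked-up JVM tree (directory names invented for the demo); it finds the server libjvm.so and skips the pruned client directory whether JAVA_HOME points at the JDK root or at its jre/ subdirectory:

```shell
#!/bin/sh
# Fake tree with both a server and a client libjvm.so.
root=$(mktemp -d)
mkdir -p "$root/jvm/jre/lib/amd64/server" "$root/jvm/jre/lib/amd64/client"
touch "$root/jvm/jre/lib/amd64/server/libjvm.so" \
      "$root/jvm/jre/lib/amd64/client/libjvm.so"

hits=0
for JAVA_HOME in "$root/jvm" "$root/jvm/jre"; do
    # Same expression as the proposed change: prune dirs named
    # "client", print the dirname of any libjvm.so found elsewhere.
    for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
        echo "JAVA_HOME=$JAVA_HOME -> $f"
        last="$f"
        hits=$((hits + 1))
    done
done

rm -rf "$root"
```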
The other option is to change
/usr/lib/hadoop-0.20/bin/hadoop-config.sh so it sets JAVA_HOME without
the jre suffix: either remove `/usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/ \`
from the candidate list, or reorder the list so
`/usr/lib/jvm/java-1.6.0-sun-1.6.0.* \` is preferred.
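Ordering matters because the candidate loop takes the first match and stops, so whichever glob is listed first wins. A sketch of that selection logic (the `-e .../bin/java` probe is my assumption about how the script tests candidates; check the real file):

```shell
#!/bin/sh
# Fake an install where both candidate globs match something.
root=$(mktemp -d)
jvm="$root/jvm/java-1.6.0-sun-1.6.0.14"
mkdir -p "$jvm/bin" "$jvm/jre/bin"
touch "$jvm/bin/java" "$jvm/jre/bin/java"

# With the plain glob listed before the */jre/ glob, JAVA_HOME
# ends up without the trailing jre/:
JAVA_HOME=
for candidate in \
    "$root"/jvm/java-1.6.0-sun-1.6.0.* \
    "$root"/jvm/java-1.6.0-sun-1.6.0.*/jre/ ; do
    if [ -e "$candidate/bin/java" ]; then
        JAVA_HOME="$candidate"
        break
    fi
done

echo "JAVA_HOME=$JAVA_HOME"
rm -rf "$root"
```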
Regards,
John

The diffs below show the recent changes that appear to have introduced
the doubled path (the `+` lines match the current behaviour):
hadoop-fuse-dfs
@@ -14,7 +14,7 @@
if [ "${LD_LIBRARY_PATH}" = "" ]; then
export LD_LIBRARY_PATH=/usr/lib
- for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so
-exec dirname {} \;`; do
+ for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name
libjvm.so -exec dirname {} \;`; do
export LD_LIBRARY_PATH=$f:${LD_LIBRARY_PATH}
done
fi
hadoop-config.sh
@@ -68,8 +68,8 @@
if [ -z "$JAVA_HOME" ]; then
for candidate in \
/usr/lib/jvm/java-6-sun \
- /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
/usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/ \
+ /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
/usr/lib/j2sdk1.6-sun \
/usr/java/jdk1.6* \
/usr/java/jre1.6* \