Hi,
  Sorry for bringing up the topic again. I have tried many times, but I am
unable to get it working.
The C interface (LibHDFS) gives me a segmentation fault even with the sample
program shipped with the package.
Here are the steps I followed.

--> Installed jdk1.5.0_14 and set $JAVA_HOME to point to it.
--> I am able to run bin/start-all.sh, and jobs run successfully
(verified from the log files).
--> I moved on to hadoop-0.15.3/src/c++/libhdfs and ran make; it compiles
after fixing the "jni.h not found" error.
--> Added hadoop-core.jar and the conf/ directory to CLASSPATH (as suggested
by Arun C Murthy); my setup is sketched below.
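
For reference, this is roughly how I set things up before running make and
the test script. The install paths, the core jar name, and the way I pointed
the build at jni.h are just examples from my machine, not anything prescribed
by the package:

  export JAVA_HOME=/usr/java/jdk1.5.0_14          # wherever the JDK is installed
  export HADOOP_HOME=/home/raghu/hadoop-0.15.3    # wherever the release is unpacked

  # jni.h comes from the JDK, so the compiler has to see $JAVA_HOME/include;
  # this is roughly how I made it visible (the exact Makefile tweak may differ)
  export CFLAGS="-I$JAVA_HOME/include -I$JAVA_HOME/include/linux"

  # hadoop core jar (name as it appears in my 0.15.3 tarball) plus the conf dir
  export CLASSPATH=$HADOOP_HOME/hadoop-0.15.3-core.jar:$HADOOP_HOME/conf

  cd $HADOOP_HOME/src/c++/libhdfs
  make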

Running test-libhdfs.sh then fails with:

test-libhdfs.sh: line 83:  8396 Segmentation fault      (core dumped)
CLASSPATH=$HADOOP_HOME/conf:$CLASSPATH LD_PRELOAD="$HADOOP_HOME/libhdfs/libhdfs.so" $LIBHDFS_BUILD_DIR/$HDFS_TEST
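
For context, the kind of calls the test exercises is roughly this. This is my
own stripped-down sketch based on the functions declared in hdfs.h, not the
shipped test program itself, and the file path is just an example:

  #include "hdfs.h"

  #include <fcntl.h>
  #include <stdio.h>
  #include <string.h>

  int main(void) {
      /* connect to the namenode named in the conf on the CLASSPATH */
      hdfsFS fs = hdfsConnect("default", 0);
      if (!fs) {
          fprintf(stderr, "hdfsConnect failed\n");
          return 1;
      }

      /* write a small file and close it again */
      const char *path = "/tmp/libhdfs_smoke_test.txt";  /* example path */
      hdfsFile f = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
      if (!f) {
          fprintf(stderr, "hdfsOpenFile failed\n");
          hdfsDisconnect(fs);
          return 1;
      }
      const char *msg = "hello from libhdfs\n";
      hdfsWrite(fs, f, (void *)msg, strlen(msg));
      hdfsCloseFile(fs, f);

      hdfsDisconnect(fs);
      return 0;
  }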

What should I do?
Is there any way around this?

-- 
Regards,
Raghavendra K
