Hi Mogrob,
The issue arises because you either:
1. don't have the configuration XML files in your CLASSPATH when you
run your libhdfs-linked program, or
2. have configured the local filesystem to be the default in your
configuration files.
To fix it you could either
1. put the configuration XML files in your CLASSPATH, or
2. set the default filesystem (fs.default.name) in those files to your
HDFS URI.
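A quick way to rule the configuration out is to name the namenode
explicitly in hdfsConnect() instead of relying on fs.default.name. A
minimal sketch (localhost:9000 is a placeholder for your namenode
address, and the Hadoop jars must still be on the CLASSPATH for the
embedded JVM to start):

#include <stdio.h>
#include <string.h>
#include <fcntl.h>   /* O_WRONLY, O_CREAT */
#include "hdfs.h"

int main(void)
{
    /* hdfsConnect("default", 0) reads fs.default.name from the XML
     * config on the CLASSPATH; an explicit host/port talks to that
     * namenode regardless of what the config says. */
    hdfsFS fs = hdfsConnect("localhost", 9000);
    const char *path = "/tmp/testfile.txt";
    const char *msg = "Hello, HDFS!\n";
    hdfsFile f;

    if (!fs) {
        fprintf(stderr, "hdfsConnect failed\n");
        return 1;
    }
    f = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!f) {
        fprintf(stderr, "hdfsOpenFile(%s) failed\n", path);
        return 1;
    }
    hdfsWrite(fs, f, msg, (tSize)strlen(msg));
    hdfsFlush(fs, f);
    hdfsCloseFile(fs, f);
    hdfsDisconnect(fs);
    return 0;
}

If the file then shows up in HDFS, the earlier behaviour was the
local-filesystem default from point 2 above.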
Devi,
libhdfs is purely a client library, so you require it only on the node
where you wish to consume it. Hence, a "client node" is sufficient.
On Fri, May 18, 2012 at 10:50 PM, Hadoop wrote:
> Harsh,
>
> Thanks for the response. I was able to install it.
>
> Sh
Devi,
[Moving question to cdh-u...@cloudera.org, bcc'd hdfs-user@].
The libhdfs is installable via package "hadoop-0.20-libhdfs" (sudo yum
install hadoop-0.20-libhdfs.x86_64 for example).
The package installs the following files so you can include/link globally:
/usr/include/hdfs.h
/usr/lib64/libhdfs.so
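For anyone checking their install, here is a tiny smoke test against
those paths. The compile line in the header comment is illustrative
(the -ljvm directory varies by JDK and architecture):

/* smoke_test.c
 * gcc smoke_test.c -I/usr/include -L/usr/lib64 -lhdfs \
 *     -L$JAVA_HOME/jre/lib/amd64/server -ljvm -o smoke_test
 */
#include <stdio.h>
#include <hdfs.h>

int main(void)
{
    /* "default" picks up fs.default.name from the XML config. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "could not connect\n");
        return 1;
    }
    printf("connected; default block size = %lld\n",
           (long long)hdfsGetDefaultBlockSize(fs));
    hdfsDisconnect(fs);
    return 0;
}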
I am currently using the CDH3 distribution and I could not find libhdfs in
the distribution.
Where could I get the libhdfs.so and the header file ?
Thanks, Devi
Hi all,
I wanted to write C++ code using the libhdfs API. To perform a simple test, I
compiled and ran the sample program at
http://hadoop.apache.org/common/docs/r0.20.2/libhdfs.html. It worked and
exited correctly, but instead of creating /tmp/textfile.txt in hdfs it
created it on the local filesystem.
Hi,
I am facing an issue with using libhdfs where hdfs only recognizes data files
copied to /tmp (the default). If I copy data (using hdfs commands) to another
directory specified in hdfs-site.xml (dfs.data.dir), the hdfsExists call fails
with an error. HADOOP_USER is set and has access to this
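(One thing worth double-checking: dfs.data.dir names the datanodes'
local block-storage directories, not an HDFS path you copy files into.)
For reference, a bare-bones way to probe a path and surface the error;
a sketch only, the path is a placeholder, and libhdfs generally reports
failures through errno:

#include <stdio.h>
#include <string.h>
#include <errno.h>
#include "hdfs.h"

int main(void)
{
    const char *path = "/user/test/data.txt"; /* placeholder path */
    hdfsFS fs = hdfsConnect("default", 0);

    if (!fs) {
        fprintf(stderr, "connect failed: %s\n", strerror(errno));
        return 1;
    }
    if (hdfsExists(fs, path) == 0) {
        printf("%s exists\n", path);
    } else {
        /* non-zero covers both "missing" and genuine errors */
        printf("%s not found: %s\n", path, strerror(errno));
    }
    hdfsDisconnect(fs);
    return 0;
}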
Hi,
I am using hadoop distribution 0.21 and there was an issue with using the
precompiled libhdfs.a: certain symbols like stderr were not found. I
decided to skip the library and link the source files ($HADOOP_HOME/
hdfs/src/c++/libhdfs) directly into my project. I had to comment out the
[mailto:mlor...@uci.cu]
Sent: Wednesday, March 07, 2012 7:36 PM
To: Amritanshu Shekhar
Subject: Re: Error while using libhdfs C API
On 03/07/2012 01:15 AM, Amritanshu Shekhar wrote:
Which platform are you using?
Did you update the dynamic linker runtime bindings (ldconfig)?
ldconfig $HOME/hadoop/c++/Linux-amd64/lib
Regards
Hi,
I was trying to link 64 bit libhdfs in my application program but it seems
there is an issue with this library. I get the following error:
Undefined                       first referenced
 symbol                             in file
stderr                              libhdfs.a(hdfs.o)
Hi folks,
I'm trying to compile Hadoop v1.0.0 libhdfs on Ubuntu 11.10 64-bit. All
java classes compile fine except when I specify -Dislibhdfs=1. Here's
where the error starts:
[exec] In file included from
/usr/include/x86_64-linux-gnu/sys/select.h:46:0,
[exec]
I did it a while back for the Condor Project, though I don't think the
code was ever merged upstream. If you're using Visual Studio, you
basically need to convert libhdfs to use C89. Tedious, but doable.
On Mon, Dec 5, 2011 at 5:23 PM, Mehul Choube wrote:
> Hi,
>
> Has anyon
Hi,
Has anyone ported libhdfs to the Windows platform? If yes, how was the experience?
Thanks,
Mehul
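To make the C89 advice above concrete, this is the sort of rewrite the
port involves. An illustrative file, not actual libhdfs code; the MSVC
C compiler of that era accepts only C89:

/* c89_style.c -- C89 requires all declarations before the first
 * statement in a block, and has no loop-scoped "for (int i = ...)",
 * no "//" comments, and no snprintf. */
#include <stdio.h>

int main(void)
{
    int i;          /* C99 could declare this inside the for header */
    int total;      /* C99 could declare this after a statement     */

    total = 0;
    for (i = 0; i < 5; i++) {
        total += i;
    }
    printf("total = %d\n", total);
    return 0;
}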
Inder,
Can you instead try, from HADOOP_HOME:
ant -Dcompile.c++=set -Dlibhdfs=set compile-c++-libhdfs
On 10-Nov-2011, at 7:59 AM, inder.p...@gmail.com wrote:
Calling make inside the libhdfs src, which uses libtool to produce the lib.
Inder
Sent from BlackBerry® on Airtel
-Original Message-
From: Harsh J
Date: Wed, 9 Nov 2011 20:58:05
To:
Reply-To: hdfs-user@hadoop.apache.org
Subject: Re: error building libhdfs
Inder,
Can you post your
Inder,
Can you post your invocation command?
On 09-Nov-2011, at 11:38 AM, Inder Pall wrote:
facing the following
if /bin/sh ./libtool --mode=compile --tag=CC gcc
-DPACKAGE_NAME=\"libhdfs\" -DPACKAGE_TARNAME=\"libhdfs\"
-DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"libhdfs\ 0.1.0\"
-DPACKAGE_BUGREPORT=\"omal...@apache.org\" -DPACKAG
++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=$XERCES_HOME
-Declipse.home=$ECLIPSE_HOME -Djdiff.home=$JDIFF_HOME -Djava5.home=
$JAVA5_HOME -Dforrest.home=$FORREST_HOME clean docs package-libhdfs
api-report tar test test-c++-libhdfs
RESULT=$?
if [ $RESULT != 0 ] ; then
echo "Build Faile
HDFS via C program linking with libhdfs.
When I run my program, I see the following error from function hdfsOpenFile():
Exception in thread "main" java.io.IOException: Mkdirs failed to create
/2010/12/1/abc4
at
org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.
I am running Hadoop version 0.21.0
On Wed, Dec 1, 2010 at 8:54 PM, Rajat Goel wrote:
Hi,
I am a new user of Hadoop and I am trying to access HDFS via C program
linking with libhdfs.
When I run my program, I see the following error from function
hdfsOpenFile():
Exception in thread "main" java.io.IOException: Mkdirs failed to create
/2010/12/1/abc4
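"Mkdirs failed" means the filesystem could not create the parent
directories of the target, and the ChecksumFileSystem frame in the
trace suggests the write is likely going to the local filesystem (the
config-on-CLASSPATH issue discussed above), where creating /2010 at
the root needs permissions. A sketch that creates the parent
explicitly and checks each step:

#include <stdio.h>
#include <fcntl.h>
#include "hdfs.h"

int main(void)
{
    const char *dir  = "/2010/12/1";      /* parent path from the error */
    const char *path = "/2010/12/1/abc4";
    hdfsFS fs = hdfsConnect("default", 0);
    hdfsFile f;

    if (!fs) {
        fprintf(stderr, "hdfsConnect failed\n");
        return 1;
    }
    /* Create the parents up front rather than relying on the implicit
     * mkdirs inside create(). */
    if (hdfsCreateDirectory(fs, dir) != 0) {
        fprintf(stderr, "hdfsCreateDirectory(%s) failed\n", dir);
        return 1;
    }
    f = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!f) {
        fprintf(stderr, "hdfsOpenFile(%s) failed\n", path);
        return 1;
    }
    hdfsCloseFile(fs, f);
    hdfsDisconnect(fs);
    return 0;
}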
Should third-party tools (like Scribe, Chukwa, etc.) access HDFS only
through the libhdfs interface?
Any suggestions!
Thanks,
Gokul
Hi,
I am trying to install libhdfs for Scribe-HDFS integration.
I am using hadoop 0.20.1 in SUSE Linux 10 SP2.
These are the steps I follow.
in hadoop/src/c++/libhdfs
./configure --enable-static --disable-shared