Hi Mogrob,

The issue arises because you either:
1. don't have the configuration XML files on your CLASSPATH when you run your libhdfs-linked program, or
2. have configured the local filesystem as the default in your configuration files.

To fix it you could either:
1. put the configuration XML directory on your CLASSPATH (note that your CLASSPATH below lists conf/core-site.xml and conf/hdfs-site.xml as individual entries; Java only honors classpath entries that are directories or jars, so add /usr/lib/hadoop-0.20/conf itself instead), or
2. explicitly specify the HDFS host and port you want to connect to in hdfsConnect, as in the sketch below.
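For option 2, here is a minimal, untested sketch; it assumes your namenode really is listening at hydra1:54310, as the core-site.xml you quote suggests:

#include "hdfs.h"
#include <stdio.h>

int main(void) {
  /* Connect to an explicit namenode instead of "default", so the result
     does not depend on which config files end up on the CLASSPATH.
     hydra1:54310 is taken from your core-site.xml; adjust as needed. */
  hdfsFS fs = hdfsConnect("hydra1", 54310);
  if (!fs) {
    fprintf(stderr, "hdfsConnect failed\n");
    return 1;
  }
  int num = 0;
  hdfsFileInfo* list = hdfsListDirectory(fs, "/", &num);
  if (list) {
    for (int i = 0; i < num; i++)
      printf("%s\n", list[i].mName);
    /* hdfsListDirectory allocates the array; release it when done. */
    hdfsFreeFileInfo(list, num);
  }
  hdfsDisconnect(fs);
  return 0;
}

If this lists HDFS paths while the "default" variant lists your local root, the config files simply aren't being picked up at runtime.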
cheers,
Colin

On Tue, Apr 24, 2012 at 9:54 AM, mogrob.sa...@libero.it <mogrob.sa...@libero.it> wrote:
> Hi all,
>
> I wanted to write a C++ program using the libhdfs API. To perform a simple
> test, I compiled and ran the sample program at
> http://hadoop.apache.org/common/docs/r0.20.2/libhdfs.html. It worked and
> exited correctly, but instead of creating /tmp/textfile.txt in HDFS it
> created it on the local filesystem.
> That seemed like a very strange issue, so I tried the code below:
>
> #include "hdfs.h"
> #include <stdio.h>
>
> int main(int argc, char **argv) {
>   hdfsFS hdfs = hdfsConnect("default", 0);
>   int num;
>   hdfsFileInfo* list = hdfsListDirectory(hdfs, "/", &num);
>   for (int i = 0; i < num; i++)
>     printf("%s\n", list[i].mName);
>   hdfsFreeFileInfo(list, num);
>   hdfsDisconnect(hdfs);
>   return 0;
> }
>
> The output was, as I expected, the list of files in / [the root directory of
> the local filesystem]. So there is a problem: hdfsConnect() chooses the local
> filesystem as the default instead of HDFS.
>
> To help in understanding the problem, here are some more details.
>
> I'm working on a 64-bit machine running Debian, kernel 2.6.32-5, with
> hadoop-0.20. The content of core-site.xml is:
>
> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
>
> <!-- Put site-specific property overrides in this file. -->
>
> <configuration>
>   <property>
>     <name>fs.default.name</name>
>     <value>hdfs://hydra1:54310/</value>
>     <description>The name of the default file system. A URI whose
>     scheme and authority determine the FileSystem implementation. The
>     uri's scheme determines the config property (fs.SCHEME.impl) naming
>     the FileSystem implementation class. The uri's authority is used to
>     determine the host, port, etc. for a filesystem.</description>
>   </property>
> </configuration>
>
> and the output of echo $CLASSPATH is:
>
> /usr/lib/hadoop-0.20/lib/ant-contrib-1.0b3.jar:
> /usr/lib/hadoop-0.20/lib/aspectjrt-1.6.5.jar:
> /usr/lib/hadoop-0.20/lib/aspectjtools-1.6.5.jar:
> /usr/lib/hadoop-0.20/lib/commons-cli-1.2.jar:
> /usr/lib/hadoop-0.20/lib/commons-codec-1.4.jar:
> /usr/lib/hadoop-0.20/lib/commons-daemon-1.0.1.jar:
> /usr/lib/hadoop-0.20/lib/commons-el-1.0.jar:
> /usr/lib/hadoop-0.20/lib/commons-httpclient-3.1.jar:
> /usr/lib/hadoop-0.20/lib/commons-lang-2.4.jar:
> /usr/lib/hadoop-0.20/lib/commons-logging-1.0.4.jar:
> /usr/lib/hadoop-0.20/lib/commons-logging-api-1.0.4.jar:
> /usr/lib/hadoop-0.20/lib/commons-net-1.4.1.jar:
> /usr/lib/hadoop-0.20/lib/core-3.1.1.jar:
> /usr/lib/hadoop-0.20/lib/guava-r09-jar:
> /usr/lib/hadoop-0.20/lib/hadoop-fairscheduler-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/lib/hsqldb-1.8.0.10.jar:
> /usr/lib/hadoop-0.20/lib/jackson-core-asl-1.5.2.jar:
> /usr/lib/hadoop-0.20/lib/jackson-mapper-asl-1.5.2.jar:
> /usr/lib/hadoop-0.20/lib/jasper-compiler-5.5.12.jar:
> /usr/lib/hadoop-0.20/lib/jasper-runtime-5.5.12.jar:
> /usr/lib/hadoop-0.20/lib/jets3t-0.6.1.jar:
> /usr/lib/hadoop-0.20/lib/jetty-6.1.26.cloudera.1.jar:
> /usr/lib/hadoop-0.20/lib/jetty-servlet-tester-6.1.26.cloudera.1.jar:
> /usr/lib/hadoop-0.20/lib/jetty-util-6.1.26.cloudera.1.jar:
> /usr/lib/hadoop-0.20/lib/jsch-0.1.42.jar:
> /usr/lib/hadoop-0.20/lib/junit-4.5.jar:
> /usr/lib/hadoop-0.20/lib/kfs-0.2.2.jar:
> /usr/lib/hadoop-0.20/lib/log4j-1.2.15.jar:
> /usr/lib/hadoop-0.20/lib/mockito-all-1.8.2.jar:
> /usr/lib/hadoop-0.20/lib/oro-2.0.8.jar:
> /usr/lib/hadoop-0.20/lib/servlet-api-2.5-20081211.jar:
> /usr/lib/hadoop-0.20/lib/servlet-api-2.5-6.1.14.jar:
> /usr/lib/hadoop-0.20/lib/slf4j-api-1.4.3.jar:
> /usr/lib/hadoop-0.20/lib/slf4j-log4j12-1.4.3.jar:
> /usr/lib/hadoop-0.20/lib/xmlenc-0.52.jar:
> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-ant.jar:
> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-core.jar:
> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-examples.jar:
> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-test.jar:
> /usr/lib/hadoop-0.20/hadoop-0.20.2-cdh3u3-tools.jar:
> /usr/lib/hadoop-0.20/hadoop-ant-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/hadoop-ant.jar:
> /usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/hadoop-core.jar:
> /usr/lib/hadoop-0.20/hadoop-examples-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/hadoop-examples.jar:
> /usr/lib/hadoop-0.20/hadoop-test-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/hadoop-test.jar:
> /usr/lib/hadoop-0.20/hadoop-tools-0.20.2-cdh3u3.jar:
> /usr/lib/hadoop-0.20/hadoop-tools.jar:
> /usr/lib/hadoop-0.20/conf/core-site.xml:
> /usr/lib/hadoop-0.20/conf/hdfs-site.xml
>
> Does anybody have an idea of what could cause this kind of issue, and how I
> could fix it?
>
> Thank you,
>
> Mogrob