Hello, 

I am getting the exception shown in the log below:

fa...@nodo1:~$ cat /home/fabio/hadoop/bin/../logs/hadoop-fabio-datanode-nodo1.log
2010-11-09 01:14:22,163 INFO
org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Total blocks:
2, missing metadata files:0, missing block files:0, missing blocks in
memory:0, mismatched blocks:0
2010-11-09 02:43:35,558 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: BlockReport of 2 blocks
got processed in 6 msecs
2010-11-09 07:14:22,695 INFO
org.apache.hadoop.hdfs.server.datanode.DirectoryScanner: Total blocks:
2, missing metadata files:0, missing block files:0, missing blocks in
memory:0, mismatched blocks:0
[...]
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = nodo1/192.168.1.11
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.21.0
STARTUP_MSG:   classpath
= 
/home/fabio/hadoop/bin/../conf:/usr/lib/jvm/java-6-sun/lib/tools.jar:/home/fabio/hadoop/bin/..:/home/fabio/hadoop/bin/../hadoop-common-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-common-test-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-hdfs-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-hdfs-0.21.0-sources.jar:/home/fabio/hadoop/bin/../hadoop-hdfs-ant-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-hdfs-test-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-hdfs-test-0.21.0-sources.jar:/home/fabio/hadoop/bin/../hadoop-mapred-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-mapred-0.21.0-sources.jar:/home/fabio/hadoop/bin/../hadoop-mapred-examples-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-mapred-test-0.21.0.jar:/home/fabio/hadoop/bin/../hadoop-mapred-tools-0.21.0.jar:/home/fabio/hadoop/bin/../lib/ant-1.6.5.jar:/home/fabio/hadoop/bin/../lib/asm-3.2.jar:/home/fabio/hadoop/bin/../lib/aspectjrt-1.6.5.jar:/home/fabio/hadoop/bin/../lib/aspectjtools-1.6.5.jar:/home/fabio/hadoop/bin/../lib/avro-1.3.2.jar:/home/fabio/hadoop/bin/../lib/commons-cli-1.2.jar:/home/fabio/hadoop/bin/../lib/commons-codec-1.4.jar:/home/fabio/hadoop/bin/../lib/commons-el-1.0.jar:/home/fabio/hadoop/bin/../lib/commons-httpclient-3.1.jar:/home/fabio/hadoop/bin/../lib/commons-lang-2.5.jar:/home/fabio/hadoop/bin/../lib/commons-logging-1.1.1.jar:/home/fabio/hadoop/bin/../lib/commons-logging-api-1.1.jar:/home/fabio/hadoop/bin/../lib/commons-net-1.4.1.jar:/home/fabio/hadoop/bin/../lib/core-3.1.1.jar:/home/fabio/hadoop/bin/../lib/ftplet-api-1.0.0.jar:/home/fabio/hadoop/bin/../lib/ftpserver-core-1.0.0.jar:/home/fabio/hadoop/bin/../lib/ftpserver-deprecated-1.0.0-M2.jar:/home/fabio/hadoop/bin/../lib/hsqldb-1.8.0.10.jar:/home/fabio/hadoop/bin/../lib/jackson-core-asl-1.4.2.jar:/home/fabio/hadoop/bin/../lib/jackson-mapper-asl-1.4.2.jar:/home/fabio/hadoop/bin/../lib/jasper-compiler-5.5.12.jar:/home/fabio/hadoop/bin/../lib/jasper-runtime-5.5.12.jar:/home/fabio/hadoop/bin/../lib/jdiff-1.0.9.jar:/home/fabio/hadoop/bin/../lib/jets3t-0.7.1.jar:/home/fabio/hadoop/bin/../lib/jetty-6.1.14.jar:/home/fabio/hadoop/bin/../lib/jetty-util-6.1.14.jar:/home/fabio/hadoop/bin/../lib/jsp-2.1-6.1.14.jar:/home/fabio/hadoop/bin/../lib/jsp-api-2.1-6.1.14.jar:/home/fabio/hadoop/bin/../lib/junit-4.8.1.jar:/home/fabio/hadoop/bin/../lib/kfs-0.3.jar:/home/fabio/hadoop/bin/../lib/log4j-1.2.15.jar:/home/fabio/hadoop/bin/../lib/mina-core-2.0.0-M5.jar:/home/fabio/hadoop/bin/../lib/mockito-all-1.8.2.jar:/home/fabio/hadoop/bin/../lib/oro-2.0.8.jar:/home/fabio/hadoop/bin/../lib/paranamer-2.2.jar:/home/fabio/hadoop/bin/../lib/paranamer-ant-2.2.jar:/home/fabio/hadoop/bin/../lib/paranamer-generator-2.2.jar:/home/fabio/hadoop/bin/../lib/qdox-1.10.1.jar:/home/fabio/hadoop/bin/../lib/servlet-api-2.5-6.1.14.jar:/home/fabio/hadoop/bin/../lib/slf4j-api-1.5.11.jar:/home/fabio/hadoop/bin/../lib/slf4j-log4j12-1.5.11.jar:/home/fabio/hadoop/bin/../lib/xmlenc-0.52.jar:/home/fabio/hadoop/bin/../lib/jsp-2.1/*.jar:/home/fabio/hadoop/hdfs/bin/../conf:/home/fabio/hadoop/hdfs/bin/../hadoop-hdfs-*.jar:/home/fabio/hadoop/hdfs/bin/../lib/*.jar:/home/fabio/hadoop/bin/../mapred/conf:/home/fabio/hadoop/bin/../mapred/hadoop-mapred-*.jar:/home/fabio/hadoop/bin/../mapred/lib/*.jar:/home/fabio/hadoop/hdfs/bin/../hadoop-hdfs-*.jar:/home/fabio/hadoop/hdfs/bin/../lib/*.jar
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.21 -r
985326; compiled by 'tomwhite' on Tue Aug 17 01:02:28 EDT 2010
************************************************************/
2010-11-09 15:38:38,106 INFO org.apache.hadoop.security.Groups: Group
mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping;
cacheTimeout=300000
2010-11-09 15:38:38,255 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode:
java.lang.IllegalArgumentException: Invalid URI for NameNode address
(check fs.defaultFS): file:/// has no authority.
        at
org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:214)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:237)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1440)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1393)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1407)
        at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1552)

2010-11-09 15:38:38,275 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at nodo1/192.168.1.11
************************************************************/
fa...@nodo1:~$ 



Does anyone know where I can set fs.defaultFS to fix this error?
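
From what I can tell, it should go in conf/core-site.xml (the property name matches the one the error message tells me to check). Below is a rough sketch of what I think that file should contain; "namenode-host" and port 9000 are only placeholders for my real NameNode address, so please correct me if this is not the right place or format:

    <?xml version="1.0"?>
    <configuration>
      <!-- NameNode address; namenode-host:9000 is a placeholder for the
           actual NameNode host and port in my cluster. -->
      <property>
        <name>fs.defaultFS</name>
        <value>hdfs://namenode-host:9000</value>
      </property>
    </configuration>

I am also assuming the same core-site.xml needs to be present in the conf directory of every node, including this datanode (nodo1), but I am not sure about that.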

thanks,

-- 
Fabio A. Miranda
[email protected]
+506 8354.0324
