After creating and starting the Hadoop namenode on AIX or Solaris, you can only
connect to the namenode via its hostname, not its IP address.
-------------------------------------------------------------------------------------------------------------------------------------------------

                 Key: HADOOP-5191
                 URL: https://issues.apache.org/jira/browse/HADOOP-5191
             Project: Hadoop Core
          Issue Type: Bug
          Components: dfs
    Affects Versions: 0.18.2
         Environment: AIX 6.1 or Solaris
            Reporter: Bill Habermaas
            Priority: Minor


After creating and starting the Hadoop namenode on AIX or Solaris, you can only
connect to the namenode via its hostname, not its IP address.

fs.default.name=hdfs://p520aix61.mydomain.com:9000
The hostname of the box is p520aix and its IP address is 10.120.16.68.

If you use the URL "hdfs://10.120.16.68" to connect to the namenode, the
exception shown below occurs. You can only connect successfully when
"hdfs://p520aix61.mydomain.com:9000" is used.

Exception in thread "Thread-0" java.lang.IllegalArgumentException: Wrong FS: hdfs://10.120.16.68:9000/testdata, expected: hdfs://p520aix61.mydomain.com:9000
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:320)
        at org.apache.hadoop.dfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:84)
        at org.apache.hadoop.dfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:122)
        at org.apache.hadoop.dfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:390)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:667)
        at TestHadoopHDFS.run(TestHadoopHDFS.java:116)
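
For context, the failing call in TestHadoopHDFS.run is of the kind sketched
below. The reporter's test is not attached to this issue, so the class name and
surrounding code are illustrative; only the fs.default.name value, the IP
address, the port, and the /testdata path come from this report, and the sketch
assumes the 0.18 FileSystem API.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical class name; the reporter's TestHadoopHDFS is not attached.
public class IpConnectRepro {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // The namenode is configured under its hostname.
    conf.set("fs.default.name", "hdfs://p520aix61.mydomain.com:9000");

    // Connect to the namenode through its IP address instead of the hostname.
    FileSystem fs = FileSystem.get(URI.create("hdfs://10.120.16.68:9000"), conf);

    // Per this report, on AIX 6.1 or Solaris this call throws
    // IllegalArgumentException: Wrong FS: hdfs://10.120.16.68:9000/testdata,
    // expected: hdfs://p520aix61.mydomain.com:9000
    // because checkPath() rejects the IP-based authority of the path.
    System.out.println(fs.exists(new Path("hdfs://10.120.16.68:9000/testdata")));
  }
}

With "hdfs://p520aix61.mydomain.com:9000/testdata" in place of the IP-based
path, the same call succeeds, as described above.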
