Hey Jay, I believe this may be related to your other issues as well, but 50070 is NOT the port you want to connect to. Port 50070 serves the NameNode web UI over HTTP; the default port for IPC connections (fs.default.name) is 8020, or whatever you have configured.
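Something like the sketch below should work (just a sketch, not tested against your cluster — the host/port are the ones from your trace with the IPC port swapped in, and it assumes the same 0.20-era hadoop-core jar on the classpath that your stack trace shows):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HadoopRemote {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the NameNode's IPC port (default 8020),
        // NOT 50070, which is the HTTP web UI.
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://172.16.112.131:8020"), conf);
        // Quick sanity check that the RPC connection actually works.
        System.out.println(fs.exists(new Path("/")));
        fs.close();
    }
}
```

Equivalently you can leave the code as FileSystem.get(conf) and just fix fs.default.name in your core-site.xml to point at hdfs://172.16.112.131:8020.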
On 31-Oct-2011, at 5:17 AM, Jay Vyas wrote:

> Hi guys : What is the meaning of an EOF exception when trying to connect
> to Hadoop by creating a new FileSystem object ? Does this simply mean
> the system cant be read ?
>
> java.io.IOException: Call to /172.16.112.131:50070 failed on local
> exception: java.io.EOFException
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1107)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
>     at $Proxy0.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
>     at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
>     at sb.HadoopRemote.main(HadoopRemote.java:35)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readInt(DataInputStream.java:375)
>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:812)
>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:720)
>
> --
> Jay Vyas
> MMSB/UCHC
