By the way, fs.default.name is set to 192.168.1.21:54310. I checked HDFS and it works fine. I installed and ran both HDFS and Hama under the same Linux account.
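For context, a minimal sketch of the two configuration files involved, using the host/port values mentioned in this thread (property names as documented for Hadoop 1.x and Hama 0.5; the file paths are the usual install-relative defaults, not confirmed from this cluster):

```xml
<!-- $HADOOP_HOME/conf/core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.1.21:54310</value>
  </property>
</configuration>

<!-- $HAMA_HOME/conf/hama-site.xml: must point at the same filesystem -->
<configuration>
  <property>
    <name>bsp.master.address</name>
    <value>slave021:40000</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://192.168.1.21:54310</value>
  </property>
</configuration>
```

If the two `fs.default.name` values disagree, the BSPMaster will try to reach the wrong filesystem at startup.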
2012/9/12 顾荣 <[email protected]>

> Thanks so much, Edward.
>
> I followed your suggestion and installed hadoop 0.20.2 instead for
> Hama. However, this time when I start Hama, a fatal error occurs and the
> bspmaster daemon cannot start up. The corresponding error message in the
> bspmaster log file is shown below.
>
> ************************************************************/
> 2012-09-12 14:40:38,238 INFO org.apache.hama.BSPMasterRunner: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting BSPMaster
> STARTUP_MSG:   host = slave021/192.168.1.21
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 1.0.0
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1214675; compiled by 'hortonfo' on Fri Dec 16 20:01:27 UTC 2011
> ************************************************************/
> 2012-09-12 14:40:38,414 INFO org.apache.hama.bsp.BSPMaster: RPC BSPMaster: host slave021 port 40000
> 2012-09-12 14:40:38,502 INFO org.apache.hadoop.ipc.Server: Starting SocketReader
> 2012-09-12 14:40:38,542 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
> 2012-09-12 14:40:38,583 INFO org.apache.hama.http.HttpServer: Port returned by webServer.getConnectors()[0].getLocalPort() before open() is -1. Opening the listener on 40013
> 2012-09-12 14:40:38,584 INFO org.apache.hama.http.HttpServer: listener.getLocalPort() returned 40013 webServer.getConnectors()[0].getLocalPort() returned 40013
> 2012-09-12 14:40:38,584 INFO org.apache.hama.http.HttpServer: Jetty bound to port 40013
> 2012-09-12 14:40:38,584 INFO org.mortbay.log: jetty-6.1.14
> 2012-09-12 14:40:38,610 INFO org.mortbay.log: Extract jar:file:/home/hadoop/hama_installs/hama-0.5.0/hama-core-0.5.0.jar!/webapp/bspmaster/ to /tmp/Jetty_slave021_40013_bspmaster____.1tzgsz/webapp
> 2012-09-12 14:41:16,073 INFO org.mortbay.log: Started SelectChannelConnector@slave021:40013
> 2012-09-12 14:41:16,218 ERROR org.apache.hama.bsp.BSPMaster: Can't get connection to Hadoop Namenode!
> java.io.IOException: Call to /192.168.1.21:54310 failed on local exception: java.io.EOFException
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1103)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>     at $Proxy5.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1386)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1404)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:254)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:123)
>     at org.apache.hama.bsp.BSPMaster.<init>(BSPMaster.java:299)
>     at org.apache.hama.bsp.BSPMaster.startMaster(BSPMaster.java:454)
>     at org.apache.hama.bsp.BSPMaster.startMaster(BSPMaster.java:449)
>     at org.apache.hama.BSPMasterRunner.run(BSPMasterRunner.java:46)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>     at org.apache.hama.BSPMasterRunner.main(BSPMasterRunner.java:56)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readInt(DataInputStream.java:392)
>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:800)
>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:745)
> 2012-09-12 14:41:16,222 FATAL org.apache.hama.BSPMasterRunner: java.lang.NullPointerException
>     at org.apache.hama.bsp.BSPMaster.getSystemDir(BSPMaster.java:862)
>     at org.apache.hama.bsp.BSPMaster.<init>(BSPMaster.java:308)
>     at org.apache.hama.bsp.BSPMaster.startMaster(BSPMaster.java:454)
>     at org.apache.hama.bsp.BSPMaster.startMaster(BSPMaster.java:449)
>     at org.apache.hama.BSPMasterRunner.run(BSPMasterRunner.java:46)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>     at org.apache.hama.BSPMasterRunner.main(BSPMasterRunner.java:56)
>
> 2012-09-12 14:41:16,223 INFO org.apache.hama.BSPMasterRunner: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down BSPMaster at slave021/192.168.1.21
> ************************************************************/
>
> Would you please give me some tips again?
>
> Thanks again.
>
> walker
>
> 2012/9/12 Edward J. Yoon <[email protected]>
>
>> Unfortunately we don't support the Hadoop secure version yet.
>>
>> Instead of 0.20.205, please use the Hadoop non-secure 0.20.2 or 1.0.3
>> versions.
>>
>> Thanks.
>>
>> On Wed, Sep 12, 2012 at 11:25 AM, 顾荣 <[email protected]> wrote:
>> > Hi all,
>> >
>> > I set up a Hama cluster of 3 nodes and started Hama successfully. However,
>> > when I run the pi example, the job fails with a very strange message, shown
>> > below.
>> >
>> > hama jar /home/hadoop/hama_installs/hama-0.5.0/hama-examples-0.5.0.jar pi
>> > org.apache.hadoop.ipc.RemoteException: java.io.IOException:
>> > java.lang.NoSuchMethodException:
>> > org.apache.hadoop.hdfs.protocol.ClientProtocol.create(java.lang.String, org.apache.hadoop.fs.permission.FsPermission, java.lang.String, boolean, boolean, short, long)
>> >     at java.lang.Class.getMethod(Class.java:1605)
>> >     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
>> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
>> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
>> >     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
>> >
>> >     at org.apache.hadoop.ipc.Client.call(Client.java:1066)
>> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>> >     at $Proxy2.create(Unknown Source)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >     at java.lang.reflect.Method.invoke(Method.java:616)
>> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>> >     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>> >     at $Proxy2.create(Unknown Source)
>> >     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:3245)
>> >     at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:713)
>> >     at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:182)
>> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:555)
>> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:536)
>> >     at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:443)
>> >     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:229)
>> >     at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1195)
>> >     at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1171)
>> >     at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1143)
>> >     at org.apache.hama.bsp.BSPJobClient.submitJobInternal(BSPJobClient.java:349)
>> >     at org.apache.hama.bsp.BSPJobClient.submitJob(BSPJobClient.java:294)
>> >     at org.apache.hama.bsp.BSPJob.submit(BSPJob.java:218)
>> >     at org.apache.hama.bsp.BSPJob.waitForCompletion(BSPJob.java:225)
>> >     at org.apache.hama.examples.PiEstimator.main(PiEstimator.java:139)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >     at java.lang.reflect.Method.invoke(Method.java:616)
>> >     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >     at org.apache.hama.examples.ExampleDriver.main(ExampleDriver.java:39)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >     at java.lang.reflect.Method.invoke(Method.java:616)
>> >     at org.apache.hama.util.RunJar.main(RunJar.java:147)
>> >
>> > My Hama version is 0.5 and my Hadoop version is 0.20.205. The error seems
>> > to come from the "org.apache.hadoop.hdfs.protocol.ClientProtocol.create"
>> > method, which is an ordinary method, so I am kind of confused...
>> >
>> > Thanks in advance.
>> >
>> > walker
>>
>> --
>> Best Regards, Edward J. Yoon
>> @eddieyoon
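Both failures in this thread are consistent with a wire-protocol mismatch: the STARTUP_MSG above shows the BSPMaster linked against Hadoop 1.0.0 client code, while the NameNode runs a different release. A rough way to spot this, sketched below, is to compare the version embedded in the hadoop-core jar name under Hama's lib directory with the cluster's `hadoop version`. The helper and the paths in the comments are illustrative assumptions, not confirmed details of this cluster:

```shell
# Hypothetical helper: pull the version string out of a hadoop-core jar name.
jar_version() {
  basename "$1" .jar | sed 's/^hadoop-core-//'
}

# Example: a jar built from branch-1.0 vs. the security branch 0.20.205 --
# if these differ between the Hama lib dir and the NameNode, RPC calls can
# fail with EOFException or NoSuchMethodException as seen in the logs above.
jar_version hadoop-core-1.0.0.jar        # prints 1.0.0
jar_version hadoop-core-0.20.205.0.jar   # prints 0.20.205.0

# On a real cluster (paths are assumptions for a typical install), compare:
#   ls "$HAMA_HOME"/lib/hadoop-core-*.jar
#   hadoop version
# and make both sides use the same Hadoop release.
```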
