Wildan,

> here is the stack trace, this happens when we create the directory
> manually using hadoop dfs -mkdir:

As I said in my first answer, the Getting Started page states that you
should not create the directory manually. Please delete it and try
again.
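Assuming your hbase.rootdir points at /hbase on that NN (the path is
just a guess on my part, check the value in your hbase-site.xml before
running anything), removing it and restarting would look like:

  # remove the manually created root directory so HBase can create it itself
  hadoop dfs -rmr /hbase
  # start HBase again; on a clean start it initializes the directory on its own
  /opt/hbase/bin/start-hbase.sh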
J-D

On Tue, Feb 3, 2009 at 10:34 PM, W <[email protected]> wrote:
> On Tue, Feb 3, 2009 at 8:52 PM, Jean-Daniel Cryans
> <[email protected]> wrote:
> > I'm having a hard time following you here... So you solved your
> > Upgrade problem? It would seem so.
>
> Sorry...
>
> Yes, I have solved it. I don't know why, but after the second HBase
> start everything is running smoothly.
>
> > You say here that your NN is on port 54310 but it is commented out
> > in your hbase-site.xml, why?
>
> It's the old configuration. I used 38440 as the namenode port because
> at first I matched the PID from the jps output against the port
> numbers in the netstat output, but then I saw in the namenode log that
> the NN port was 54310, so I deleted the 38440 again and used 54310 as
> the NN port instead.
>
> > You also talk about a null pointer, can you post the stack trace?
>
> Here is the stack trace; this happens when we create the directory
> manually using hadoop dfs -mkdir:
>
> ---cut-----------
> 2009-01-29 16:58:59,301 INFO org.apache.hadoop.hbase.master.HMaster:
> vmName=Java HotSpot(TM) Server VM, vmVendor=Sun Microsystems Inc.,
> vmVersion=11.0-b15
> 2009-01-29 16:58:59,302 INFO org.apache.hadoop.hbase.master.HMaster:
> vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError,
> -Dhbase.log.dir=/opt/hbase/bin/../logs,
> -Dhbase.log.file=hbase-hadoop-master-tobeThink.log,
> -Dhbase.home.dir=/opt/hbase/bin/.., -Dhbase.id.str=hadoop,
> -Dhbase.root.logger=INFO,DRFA,
> -Djava.library.path=/opt/hbase/bin/../lib/native/Linux-i386-32]
> 2009-01-29 16:58:59,744 ERROR org.apache.hadoop.hbase.master.HMaster:
> Can not start master
> java.io.IOException: Call to
> tobethink.pappiptek.lipi.go.id/192.168.107.119:54310 failed on local
> exception: null
>     at org.apache.hadoop.ipc.Client.call(Client.java:699)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
>     at $Proxy0.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
>     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:104)
>     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:177)
>     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:74)
>     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
>     at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
>     at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:186)
>     at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
>     at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
>     at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
>     at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:978)
>     at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1022)
> Caused by: java.io.EOFException
>     at java.io.DataInputStream.readInt(DataInputStream.java:375)
>     at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:493)
>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:438)
> ------cut--------------
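The EOFException at the bottom means the connection to
192.168.107.119:54310 was closed before the master got a response, so
before retrying it is also worth double-checking that 54310 really is
the NN's RPC port and that hbase.rootdir uses exactly the same
host:port as fs.default.name. A quick check (the file name assumes a
Hadoop 0.19-style conf layout, adjust for your install):

  # fs.default.name is the authoritative NN host:port
  grep -A 1 'fs.default.name' conf/hadoop-site.xml
  # confirm the NN is actually listening on that port
  netstat -nl | grep 54310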
> I hope that's clear enough, thanks!
>
> Regards,
> Wildan
>
> --
> ---
> tobeThink!
> www.tobethink.com
>
> Aligning IT and Education
>
> 021-99325243
> Y! : hawking_123
> LinkedIn : http://www.linkedin.com/in/wildanmaulana
