Did you replace the hadoop jars that are under hbase/lib with those of the cluster you are connecting to? 0.96.0 bundles 2.1.0-beta, and it sounds like you want to connect to hadoop-2.2.0. The two hadoops need to be the same version (the above is a new variant of the mismatch message, each more cryptic than the last).
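Something like the below is one way to do the swap — a rough sketch only; the `swap_hadoop_jars` helper name and the example paths are mine, so adjust them to wherever your HBase and Hadoop installs actually live:

```shell
# Sketch of the jar swap described above. The function name and the
# example paths in the comments are illustrative, not from the thread.
swap_hadoop_jars() {
  hbase_home=$1   # e.g. /opt/hbase-0.96.0
  hadoop_home=$2  # e.g. /opt/hadoop-2.2.0 (the cluster's install)

  # Drop the bundled 2.1.0-beta hadoop jars so they cannot shadow
  # the cluster's jars on the classpath.
  rm -f "$hbase_home"/lib/hadoop-*.jar

  # Copy in the hadoop jars the cluster actually runs.
  find "$hadoop_home" -name 'hadoop-*.jar' \
    -exec cp {} "$hbase_home/lib/" \;
}
```

Then restart the master and region servers so they pick up the new jars.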
St.Ack

On Wed, Oct 23, 2013 at 7:38 AM, Paul Honig <[email protected]> wrote:
> Hi,
>
> I'm trying to get an HBase 0.96 install to work with Hadoop 2.2.
> The region servers seem to be running fine, though the HMaster exits with
> the following stack trace.
>
> 2013-10-23 16:31:05,206 INFO [master:client1:60000] master.ActiveMasterManager: Registered Active Master=client1.local,60000,1382538662205
> 2013-10-23 16:31:05,232 WARN [master:client1:60000] conf.Configuration: fs.default.name is deprecated. Instead, use fs.defaultFS
> 2013-10-23 16:31:05,468 FATAL [master:client1:60000] master.HMaster: Unhandled exception. Starting shutdown.
> org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.RpcServerException): Unknown out of band call #-2147483647
>         at org.apache.hadoop.ipc.Client.call(Client.java:1347)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>         at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:188)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:561)
>         at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2102)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:994)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:978)
>         at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:433)
>         at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:852)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:435)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:146)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:127)
>         at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:786)
>         at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:603)
>         at java.lang.Thread.run(Thread.java:724)
>
> I tried to Google the error, but got no results.
> Anybody have an idea of what I'm doing wrong?
>
> Kind regards,
> Paul Honig
