When I tried to restart the entire Hadoop stack, the error says:
2012-02-08 15:18:22,269 FATAL org.apache.hadoop.hbase.master.HMaster: Unhandled exception. Starting shutdown.
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/FSConstants$SafeModeAction
        at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:439)
        at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:323)
        at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:128)
        at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:113)
        at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:448)
        at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:326)
        at java.lang.Thread.run(Thread.java:636)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hdfs.protocol.FSConstants$SafeModeAction
        at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        ... 7 more
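
A quick way to check whether that class exists in any jar on the HBase
classpath, using the JDK's jar tool (paths here are examples; adjust to
your install):

    cd /usr/local/hbase/lib
    for j in hadoop*.jar; do
      jar tf "$j" | grep -qF 'FSConstants$SafeModeAction' && echo "found in $j"
    done

If nothing matches, the hdfs jar on the classpath does not define the class
HBase was compiled against (FSConstants was reorganized in 0.23), so swapping
jars alone may not be enough without an HBase build that matches.
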
On Wed, Feb 8, 2012 at 3:06 PM, raghavendhra rahul <[email protected]> wrote:
> Thanks for the help.
> Now I get the following error while starting HMaster:
>
> java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;)Lorg/apache/hadoop/ipc/VersionedProtocol;
>         at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:119)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:238)
>         at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:203)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1936)
>         at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:76)
>         at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1970)
>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1952)
>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:268)
>         at org.apache.hadoop.fs.Path.getFileSystem(Path.java:189)
>         at org.apache.hadoop.hbase.util.FSUtils.getRootDir(FSUtils.java:471)
>         at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:94)
>         at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:448)
>         at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:326)
>         at java.lang.Thread.run(Thread.java:636)
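>
> A NoSuchMethodError on a Hadoop class usually points to mixed Hadoop
> versions on the classpath. As a sanity check, something like the following
> (HBASE_HOME standing in for your install directory) should list only one
> version of each hadoop jar:
>
>   ls $HBASE_HOME/lib/hadoop*.jar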
>
>
>
> On Wed, Feb 8, 2012 at 2:49 PM, Mingjie Lai <[email protected]> wrote:
>
>> Hadoop 0.23+ ships with multiple jars instead of the single
>> hadoop-core-xxx.jar found in 0.20 / hadoop-1.
>>
>> The jar files are under the share directory:
>>
>> hadoop-0.23.0/share $ find . -name hadoop*.jar | grep -v source | grep -v test
>> ./hadoop/common/hadoop-common-0.23.0.jar
>> ./hadoop/common/lib/hadoop-yarn-common-0.23.0.jar
>> ./hadoop/common/lib/hadoop-yarn-api-0.23.0.jar
>> ./hadoop/common/lib/hadoop-mapreduce-client-core-0.23.0.jar
>> ./hadoop/common/lib/hadoop-mapreduce-client-app-0.23.0.jar
>> ./hadoop/common/lib/hadoop-yarn-server-common-0.23.0.jar
>> ./hadoop/common/lib/hadoop-mapreduce-client-common-0.23.0.jar
>> ./hadoop/common/lib/hadoop-auth-0.23.0.jar
>> ./hadoop/common/lib/hadoop-mapreduce-client-jobclient-0.23.0.jar
>> ./hadoop/hdfs/hadoop-hdfs-0.23.0.jar
>>
>> So you need to make sure all of these jars are placed under your hbase/lib
>> directory.
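>>
>> For example, roughly (paths are illustrative; adjust to your layout):
>>
>>   rm $HBASE_HOME/lib/hadoop-core-*.jar
>>   find /path/to/hadoop-0.23.0/share -name 'hadoop-*.jar' \
>>     | grep -v source | grep -v test \
>>     | xargs -I{} cp {} $HBASE_HOME/lib/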
>>
>>
>>
>> On 02/07/2012 10:51 PM, raghavendhra rahul wrote:
>>
>>> I have replaced the jar, yet I get the following error:
>>>
>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.PlatformName
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> Could not find the main class: org.apache.hadoop.util.PlatformName. Program will exit.
>>> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>         at org.apache.hadoop.hbase.util.HBaseConfTool.main(HBaseConfTool.java:38)
>>> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>         ... 12 more
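>>>
>>> Both PlatformName and Configuration live in hadoop-common, so it looks
>>> like the common jar is not on the classpath at all. One way to check,
>>> assuming your bin/hbase supports the classpath command:
>>>
>>>   bin/hbase classpath | tr ':' '\n' | grep hadoop-common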
>>>
>>>
>>> On Wed, Feb 8, 2012 at 1:31 AM, Stack <[email protected]> wrote:
>>>
>>>> On Tue, Feb 7, 2012 at 1:16 AM, raghavendhra rahul
>>>> <[email protected]> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I tried installing HBase on top of Hadoop YARN. I get the following
>>>>> error. Any suggestions?
>>>>> client1: Exception in thread "main" org.apache.hadoop.ipc.RemoteException:
>>>>> Server IPC version 5 cannot communicate with client version 3
>>>>> client1:     at org.apache.hadoop.ipc.Client.call(Client.java:740)
>>>>> client1:     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>>>>> client1:     at $Proxy5.getProtocolVersion(Unknown Source)
>>>>> client1:     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
>>>>> client1:     at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
>>>>> client1:     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
>>>>> client1:     at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
>>>>> client1:     at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
>>>>> client1:     at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
>>>>>
>>>>
>>>>
>>>> You need to replace the hadoop jar that is under hbase/lib with the one
>>>> from the cluster you are trying to communicate with.
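>>>>
>>>> One quick way to confirm the mismatch (commands are illustrative):
>>>>
>>>>   hadoop version                     # version the cluster is running
>>>>   ls $HBASE_HOME/lib/hadoop*.jar     # version on HBase's classpath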
>>>>
>>>> Please read the reference guide. It's plain you have not:
>>>> http://hbase.apache.org/book.html#getting_started
>>>>
>>>> Thanks,
>>>> St.Ack
>>>>
>>>>
>>>
>