Thanks, using the new hbase-default.xml solved the problem.

-Yair

-----Original Message-----
From: Jean-Daniel Cryans [mailto:[EMAIL PROTECTED] 
Sent: Monday, July 28, 2008 11:53 AM
To: [email protected]
Subject: Re: problem starting master with hbase 0.2.0

Yair,

I guess your conf folder is kept separate from your binaries. That is a
good practice, but you have to replace your hbase-default.xml with the one
shipped in 0.2.0 to prevent errors like this one (a class whose package
changed).

J-D
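In practice the fix is a one-line copy. A minimal sketch, assuming the 0.2.0 release was unpacked under $HBASE_HOME and the separated conf directory is $HBASE_CONF_DIR (both placeholder paths; adjust to your layout):

```shell
# Placeholder locations; adjust to your own layout.
HBASE_HOME=${HBASE_HOME:-$HOME/hbase-0.2.0}   # unpacked 0.2.0 release
HBASE_CONF_DIR=${HBASE_CONF_DIR:-$HOME/conf}  # separated conf directory

if [ -f "$HBASE_HOME/conf/hbase-default.xml" ]; then
    # Back up the stale file, then take the copy shipped with 0.2.0.
    cp "$HBASE_CONF_DIR/hbase-default.xml" "$HBASE_CONF_DIR/hbase-default.xml.bak"
    cp "$HBASE_HOME/conf/hbase-default.xml" "$HBASE_CONF_DIR/"
    echo "updated $HBASE_CONF_DIR/hbase-default.xml"
else
    echo "unpacked release not found at $HBASE_HOME"
fi
```

Site-specific overrides stay in hbase-site.xml, so only hbase-default.xml needs to be refreshed when upgrading.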

On Mon, Jul 28, 2008 at 12:49 PM, Yair Even-Zohar
<[EMAIL PROTECTED]>wrote:

> I have a similar setup to my 0.1.2 one, but started from a clean state.
>
> Hadoop 0.17.0 seems to be working fine.
>
> When I start the HBase master I get
> java.lang.reflect.InvocationTargetException, but the weirdest thing is
> that I also get:
>
> Caused by: java.lang.UnsupportedOperationException: Unable to find
> region server interface org.apache.hadoop.hbase.HRegionInterface
>
> and...
>
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hbase.HRegionInterface
>
> I copied the log below, followed by my hbase-site.xml and
> hadoop-site.xml.
>
> Thanks
>
> -Yair
>
> Here is the full HBase master log:
>
> Sun Jul 27 19:02:31 PDT 2008 Starting master on sb-centercluster01.sb.dmtest.com
> java version "1.5.0_10"
> Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_10-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 1.5.0_10-b03, mixed mode)
> ulimit -n 1024
> 2008-07-27 19:02:32,912 INFO org.apache.hadoop.hbase.master.HMaster: Root region dir: hdfs://sb-centercluster01:9000/hbase/-ROOT-/70236052
> 2008-07-27 19:02:33,322 INFO org.apache.hadoop.hbase.master.HMaster: BOOTSTRAP: creating ROOT and first META regions
> 2008-07-27 19:02:33,383 INFO org.apache.hadoop.hbase.regionserver.HLog: New log writer created at /hbase/-ROOT-/70236052/log/hlog.dat.1217210553339
> 2008-07-27 19:02:33,440 INFO org.apache.hadoop.hbase.regionserver.HRegion: region -ROOT-,,0/70236052 available
> 2008-07-27 19:02:33,459 INFO org.apache.hadoop.hbase.regionserver.HLog: New log writer created at /hbase/.META./1028785192/log/hlog.dat.1217210553452
> 2008-07-27 19:02:33,514 INFO org.apache.hadoop.hbase.regionserver.HRegion: region .META.,,1/1028785192 available
> 2008-07-27 19:02:33,570 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded the native-hadoop library
> 2008-07-27 19:02:33,572 INFO org.apache.hadoop.io.compress.zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> 2008-07-27 19:02:33,983 INFO org.apache.hadoop.hbase.regionserver.HRegion: closed -ROOT-,,0
> 2008-07-27 19:02:34,014 INFO org.apache.hadoop.hbase.regionserver.HRegion: closed .META.,,1
> 2008-07-27 19:02:34,078 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=60002, port=60002
> 2008-07-27 19:02:34,238 ERROR org.apache.hadoop.hbase.master.HMaster: Can not start master
> java.lang.reflect.InvocationTargetException
>        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>        at java.lang.reflect.Constructor.newInstance(Constructor.java:494)
>        at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:798)
>        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:832)
> Caused by: java.lang.UnsupportedOperationException: Unable to find region server interface org.apache.hadoop.hbase.HRegionInterface
>        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.<init>(HConnectionManager.java:162)
>        at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:88)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:228)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:148)
>        ... 6 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HRegionInterface
>        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
>        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
>        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
>        at java.lang.Class.forName0(Native Method)
>        at java.lang.Class.forName(Class.java:164)
>        at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.<init>(HConnectionManager.java:158)
>        ... 9 more
>
> My hbase-site.xml:
>
> <configuration>
>  <property>
>    <name>hbase.master</name>
>    <value>sb-centercluster01:60002</value>
>    <description>The host and port that the HBase master runs at.
>    </description>
>  </property>
>  <property>
>    <name>hbase.rootdir</name>
>    <value>hdfs://sb-centercluster01:9000/hbase</value>
>    <description>The directory shared by region servers.
>    </description>
>  </property>
>  <property>
>    <name>hbase.io.index.interval</name>
>    <value>8</value>
>  </property>
>  <property>
>    <name>hbase.hregion.max.filesize</name>
>    <value>67108864</value>
>  </property>
> </configuration>
>
> My hadoop-site.xml:
>
> <configuration>
>  <property>
>    <name>fs.default.name</name>
>    <value>sb-centercluster01:9000</value>
>  </property>
>  <property>
>    <name>mapred.job.tracker</name>
>    <value>sb-centercluster01:9001</value>
>  </property>
>  <property>
>    <name>mapred.map.tasks</name>
>    <value>80</value>
>  </property>
>  <property>
>    <name>mapred.reduce.tasks</name>
>    <value>16</value>
>  </property>
>  <property>
>    <name>dfs.replication</name>
>    <value>3</value>
>  </property>
>  <property>
>    <name>dfs.name.dir</name>
>    <value>/home/hadoop/dfs,/tmp/hadoop/dfs</value>
>  </property>
>  <property>
>    <name>dfs.data.dir</name>
>    <value>/state/partition1/hadoop/dfs</value>
>  </property>
>  <property>
>    <name>mapred.child.java.opts</name>
>    <value>-Xmx1024m</value>
>  </property>
> </configuration>
>
>
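As a quick sanity check for ClassNotFoundException errors like the one above, you can list the hbase jar's entries to see which package HRegionInterface actually lives in under 0.2.0 (the jar path below is a placeholder for whatever is on your master's classpath):

```shell
# Placeholder path; point it at the hbase jar on the master's classpath.
HBASE_JAR=${HBASE_JAR:-/path/to/hbase-0.2.0.jar}
if [ -f "$HBASE_JAR" ]; then
    # A jar is just a zip archive; list its entries and search for the class.
    unzip -l "$HBASE_JAR" | grep HRegionInterface
else
    echo "jar not found: $HBASE_JAR"
fi
```

If the entry shown no longer matches the fully qualified name in hbase-default.xml, the stale config is the culprit, which is what the updated file fixes.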
