As a follow-up, I retried creating a table in the HBase shell; surprisingly,
it no longer works (it did before). In the log file I saw this error:

2009-07-28 16:26:30,291 INFO org.apache.hadoop.hbase.master.HMaster:
Waiting for dfs to exit safe mode...
2009-07-28 16:26:40,313 INFO org.apache.hadoop.hbase.master.HMaster: Waiting
for dfs to exit safe mode...
2009-07-28 16:26:50,316 INFO org.apache.hadoop.hbase.master.HMaster: Waiting
for dfs to exit safe mode...
2009
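The master will wait indefinitely while HDFS is in safe mode, so the first thing worth checking is whether the NameNode ever left it. A minimal sketch of that check (the real command is Hadoop's `bin/hadoop dfsadmin -safemode get`; its typical output string is stubbed in below so the parsing itself is visible):

```shell
#!/bin/sh
# Real invocation (from HADOOP_HOME):  bin/hadoop dfsadmin -safemode get
# It prints a line such as "Safe mode is ON" or "Safe mode is OFF".
# The output string is stubbed here so the check itself is reproducible:
status="Safe mode is ON"
case "$status" in
  *ON*)  msg="HDFS is still in safe mode - the HBase master will keep waiting" ;;
  *OFF*) msg="HDFS has left safe mode" ;;
esac
echo "$msg"
```

If HDFS stays in safe mode because datanodes never reported in, `bin/hadoop dfsadmin -safemode leave` forces it out, but the underlying DFS problem should be found first.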

Moreover, whenever I try to stop the HBase master by running
"bin/stop-hbase.sh", it never succeeds in stopping (it did before)! I don't
know what's wrong!
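When stop-hbase.sh hangs like this, a likely cause is that the master is stuck in the safe-mode wait loop and never processes the shutdown. One way to see whether the master JVM is still alive is `jps`, which ships with the JDK; a sketch of the check (the process listing is stubbed in, since the real output depends on what is running):

```shell
#!/bin/sh
# Real invocation:  jps   (lists running JVM processes, one "pid ClassName" per line)
# A sample listing is stubbed here so the check is reproducible:
procs="12345 HMaster
12346 HRegionServer"
masters=$(echo "$procs" | grep -c "HMaster")
echo "HMaster JVMs still running: $masters"
# If the master really is stuck, its PID can be killed as a last resort:
#   kill <pid>
```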



On Tue, Jul 28, 2009 at 12:20 PM, Xine Jar <[email protected]> wrote:

> Thanks I can compile it now.
>
> But I still have a problem.
>
> *The Hadoop 0.19.1 cluster is running on four nodes:*
> -X.X.X.92=data node
> -X.X.X.85=jobtracker
> -X.X.X.72=slave
> -X.X.X.58=slave
>
> *I configured HBase 0.19.3 to run on this cluster:*
> -X.X.X.72: the HBase master, with its hbase.rootdir pointing at X.X.X.92
> -X.X.X.58: a regionserver.
>
> From the HBase shell on X.X.X.72 I could create and modify tables.
>
> *Problem*:
> Now I have compiled the MyClient.java program on node X.X.X.72; it worked,
> thanks to your help. But when I try to run it with the command "bin/hbase
> MyClient", it gives me the following error:
>
> 2009-07-28 11:53:42,226 INFO
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers: getMaster
> attempt 0 of 10 failed; retrying after sleep of 2000
> java.net.ConnectException: Call to
> pc72.test.mobnets.rwth-aachen.de/134.130.223.72:9001 failed on connection
> exception: java.net.ConnectException: Connection refused
>     at
> org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:728)
>     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:704)
>     at
> org.apache.hadoop.hbase.ipc.HBaseRPC$Invoker.invoke(HBaseRPC.java:321)
>     at $Proxy0.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:467)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:443)
>     at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:491)
>     at
> org.apache.hadoop.hbase.client.HConnectionManager$TableServers.getMaster(HConnectionManager.java:207)
>     at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:70)
>     at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1033)
>     at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
> Caused by: java.net.ConnectException: Connection refused
>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>     at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
>     at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
>     at
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.setupIOstreams(HBaseClient.java:304)
>     at
> org.apache.hadoop.hbase.ipc.HBaseClient$Connection.access$1700(HBaseClient.java:181)
>     at
> org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:805)
>     at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:690)
>     ... 9 more
> 2009-07-28 11:53:45,232 INFO org.apache.hadoop.ipc.HBaseClass: Retrying
> connect to server: pc72.test.mobnets.rwth-aachen.de/134.130.223.72:9001.
> Already tried 0 time(s).
> 2009-07-28 11:53:46,236 INFO org.apache.hadoop.ipc.HBaseClass: Retrying
> connect to server: pc72.test.mobnets.rwth-aachen.de/134.130.223.72:9001.
> Already tried 1 time(s).
> 2009-07-28 11:53:47,240 INFO org.apache.hadoop.ipc.HBaseClass: Retrying
> connect to server: pc72.test.mobnets.rwth-aachen.de/13
>
>
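"Connection refused" means nothing was listening on pc72.test.mobnets.rwth-aachen.de:9001 at all, so it is worth checking reachability independently of HBase. A minimal sketch using bash's built-in /dev/tcp redirection (nc or telnet would serve equally well; the host and port below are placeholders, with port 1 chosen so the refused case is reproducible):

```shell
#!/bin/bash
# Substitute the host and port from the error above, e.g.:
#   host=pc72.test.mobnets.rwth-aachen.de ; port=9001
host=127.0.0.1
port=1
if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
  result=open
else
  result=refused
fi
echo "$host:$port is $result"
```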
> *Questions:*
> -I doubt that my HBase configuration is correct. Can the HBase master run on
> any of the four nodes, even on the datanode or jobtracker nodes, or should
> it specifically be one of the slaves, as I set it up?
>
> -Given that the HBase master is one of the nodes, should the file
> "regionservers" contain all of the remaining nodes, or should I exclude the
> jobtracker and datanode addresses?
>
> -Finally, any idea whether my error comes from my HBase configuration or
> from something else?
>
> Thank you for answering my questions.
> CJ
>
> On Mon, Jul 27, 2009 at 8:02 PM, tim robertson
> <[email protected]> wrote:
>
>> Add the hadoop.jar to the classpath when you are compiling and you should
>> be ok.
>>
>> javac -classpath ${HBASE_HOME}/hbase-${HBASE_VERSION}.jar:
>> ${HADOOP_HOME}/hadoop-${HADOOP_VERSION}.jar -d test_classes...
>>
>> The config you changed provides the paths at runtime, but not at compile
>> time.
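To make the compile line above concrete, here is a small sketch that assembles the classpath from the variables this thread already uses (the paths come from the hbase-env.sh quoted in this thread; the exact hadoop jar file name is an assumption, so check what actually sits in HADOOP_HOME) and prints the javac invocation instead of running it:

```shell
#!/bin/sh
# Values taken from the hbase-env.sh quoted in this thread; the hadoop jar
# name is an assumption - verify the real file name in HADOOP_HOME.
HBASE_HOME=/root/Desktop/hbase-0.19.3
HBASE_VERSION=0.19.3
HADOOP_HOME=/root/Desktop/hadoop-0.19.1
HADOOP_VERSION=0.19.1
CP="${HBASE_HOME}/hbase-${HBASE_VERSION}.jar:${HADOOP_HOME}/hadoop-${HADOOP_VERSION}.jar"
# Print the command rather than executing it, since the jars may live elsewhere:
echo javac -classpath "$CP" -d test_classes MyClient.java
```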
>>
>> Cheers
>>
>> Tim
>>
>> On Mon, Jul 27, 2009 at 7:59 PM, Xine Jar <[email protected]>
>> wrote:
>> > Hello, I am running Hadoop version 0.19.1 on a cluster of 4 nodes, and
>> > Hadoop is working fine.
>> > I have also installed HBase version 0.19.3, following the guideline at
>> > http://hadoop.apache.org/hbase/docs/r0.19.3/api/overview-summary.html.
>> > I managed to configure HBase on the cluster, and I could even create
>> > tables using the HBase shell.
>> >
>> > *I know that my problem resides in a path, but I don't know how to
>> > correct it:*
>> > I copied the Java program MyClient.java from the above-mentioned link and
>> > tried to compile it on the HBase master:
>> >
>> > pc72:~/Desktop/hbase-0.19.3 #
>> > javac -classpath ${HBASE_HOME}/hbase-${HBASE_VERSION}.jar -d test_classes
>> > MyClient.java
>> >
>> > *I got the following errors:*
>> > MyClient.java:9: package org.apache.hadoop.conf does not exist
>> > import org.apache.hadoop.conf.Configuration;
>> >                             ^
>> > MyClient.java:45: cannot access org.apache.hadoop.io.WritableComparable
>> > class file for org.apache.hadoop.io.WritableComparable not found
>> >    table.commit(batchUpdate);
>> >         ^
>> > 2 errors
>> >
>> > *What I did not forget to do:*
>> > I fulfilled the two required steps:
>> >
>> >   - Add a pointer to your HADOOP_CONF_DIR to CLASSPATH in hbase-env.sh
>> >
>> >         My hbase-env.sh looks like this:
>> >
>> >          export JAVA_HOME="/usr/lib64/jvm/java"
>> >          export HBASE_HOME="/root/Desktop/hbase-0.19.3"
>> >          export HBASE_VERSION="0.19.3"
>> >          export HBASE_CLASSPATH="/root/Desktop/hadoop-0.19.1/conf"
>> >
>> >
>> >
>> >   - Add a copy of hadoop-site.xml to ${HBASE_HOME}/conf,
>> >
>> >      I have copied the file
>> >
>> >  *Am I compiling wrong? Can anyone tell me if I forgot a path somewhere?*
>> >
>> > I appreciate any help.
>> >
>> > Thank you
>> >
>>
>
>
