Hi Stack:

Thanks for the reply. Yes, I set the full path to hbase/conf.

Actually, I printed out all the environment variables in my map/reduce
program, and I can see HADOOP_CLASSPATH is set to the right place (it
includes the hbase jar and the hbase conf directory).
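For reference, here is roughly what the full-path setup in hadoop-env.sh looks like on my side; the install location below is hypothetical, so substitute wherever hbase actually lives:

```shell
# Hypothetical install location -- adjust to your own hbase directory.
HBASE_HOME=/usr/local/hbase-0.19.3

# Full paths to the hbase jars and the conf directory.
export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.19.3.jar:$HBASE_HOME/hbase-0.19.3-test.jar:$HBASE_HOME/conf
```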

Just want to double check: if everything works fine, my map/reduce program
should try to connect to <master_machine>:60000 instead of localhost:60000, right?
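For what it's worth, my understanding of the 0.19-era client is that it reads the master address from the hbase.master property in whatever hbase-site.xml it finds on the classpath, and falls back to localhost:60000 when no conf directory is visible. So a conf that points at the master would look something like this (the host name here is hypothetical):

```xml
<!-- hbase-site.xml: hbase.master should name your actual master machine -->
<configuration>
  <property>
    <name>hbase.master</name>
    <value>master-machine.example.com:60000</value>
  </property>
</configuration>
```

If the client still reports localhost:60000 with this on the classpath, the conf directory is probably not actually being picked up.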

Regards

tp

On Wed, Sep 9, 2009 at 6:45 AM, stack <[email protected]> wrote:

> On Tue, Sep 8, 2009 at 6:13 PM, charles du <[email protected]> wrote:
>
> > Hi:
> >
> > I installed hbase 0.19.3 and hadoop 0.19.1. I tried to run the BulkImport
> > example on http://wiki.apache.org/hadoop/Hbase/MapReduce and got the
> > following error:
> >
> >        "org.apache.hadoop.hbase.MasterNotRunningException:
> localhost:60000"
> >
> > From the error message, it looks like hadoop is looking in the wrong place
> > for the hbase configuration. I added the paths to hbase-0.19.3.jar,
> > hbase-0.19.3-test.jar, and hbase/conf to "export HADOOP_CLASSPATH=" in
> > 'hadoop-env.sh', propagated the change to every hadoop machine, and
> > restarted hadoop.
> >
> >
> It looks like you are doing all the right stuff, and I'd agree with your
> speculation that it's not finding the hbase configuration. You have set
> full paths to the hbase/conf directory in HADOOP_CLASSPATH?
>
> St.Ack
>



-- 
tp
