I am using the Windows distribution; it must have come with Hadoop built
in, as I did not install Hadoop separately. The distribution, however, does
not have any Hadoop commands or shell scripts.
So it looks like this is an HBase Windows distribution issue?
Andrew Purtell wrote:
To run HBase you must have Hadoop installed underneath. The Hadoop top-level command-line script 'bin/hadoop' is available in your Hadoop installation. Just put the Hadoop bin/ dir on the path along with the HBase bin/ dir.
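For example, a rough sketch of doing that (the install locations below are just placeholder paths; substitute wherever Hadoop and HBase actually live on your machine):

On Linux/Mac:
  export PATH=$PATH:/opt/hadoop/bin:/opt/hbase/bin

On Windows (cmd prompt):
  set PATH=%PATH%;C:\hadoop\bin;C:\hbase\bin

After that, both the 'hadoop' and 'hbase' scripts can be run from any directory.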
You can also run some Hadoop applications using the HBase command script, but
it is missing the shortcuts; e.g., instead of
hadoop fs -copyFromLocal ...
it is
hbase org.apache.hadoop.fs.FsShell -copyFromLocal ...
This will work for DFS, but for other filesystems that need supporting jars in lib/, your mileage may vary. Just put the Hadoop bin/ dir on the path along with the HBase bin/ dir and use the appropriate one for the circumstances.
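For instance, to load a local file into DFS (the file and directory names here are only placeholders for illustration):

  hadoop fs -copyFromLocal /tmp/mydata.csv /user/ravi/mydata.csv

or, equivalently, through the HBase script:

  hbase org.apache.hadoop.fs.FsShell -copyFromLocal /tmp/mydata.csv /user/ravi/mydata.csv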
- Andy
________________________________
From: Ravi <[email protected]>
To: [email protected]
Sent: Sun, November 29, 2009 8:28:58 AM
Subject: Re: Can I use hadoop inside hbase?
Thanks for the quick clarification, Jeff.
In the HBase distribution, I am unable to find the hadoop command to load files
into DFS. How do I add the Hadoop commands to the HBase distribution?
Jeff Zhang wrote:
Of course you can. I run my Pig scripts and HBase on the same
cluster, and it works fine.
Jeff Zhang
On Sat, Nov 28, 2009 at 9:45 PM, Ravi <[email protected]> wrote:
If I set up an HBase cluster, say with 10 nodes, can I also use the same
cluster for my distributed programs using the Hadoop running underneath
HBase?
Essentially, the thought is to use the underlying Hadoop for two purposes: data
as well as logic.
Any thoughts?