Hello MS,

Re file systems: while HBase can theoretically run on other scalable file 
systems, I remember somebody on the HBase list saying, in effect, that unless 
you are a file system guru and willing to put in a heck of a lot of work, the 
only practical choice as an underlying file system is Hadoop's HDFS. I think 
that was something like half a year ago or more, so maybe things have changed.  
Do any of the HBase developers on the list have an update (or a correction 
to my recollection)?

Ron

Ronald Taylor, Ph.D.
Computational Biology & Bioinformatics Group
Pacific Northwest National Laboratory (U.S. Dept of Energy/Battelle)
Richland, WA 99352
phone: (509) 372-6568
email: [email protected]

From: M S Vishwanath Bhat [mailto:[email protected]]
Sent: Tuesday, August 16, 2011 12:29 AM
To: [email protected]
Subject: Re: Need Help with HBase

Hi,

Just need a small clarification.

HBase is used only to create and maintain big tables. We can use HBase to 
create, append, extend, and so on. And it runs on any file system: if we 
point the "rootdir" property in hbase-site.xml to an NFS mount point, it 
should still work. HBase doesn't even need Hadoop to create and maintain 
large tables. BUT Hadoop becomes significant only when I want to run 
map/reduce applications on a large table created by HBase.
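For reference, the setting I mean is the "hbase.rootdir" property in 
hbase-site.xml; the path below is just an illustration of pointing it at a 
local/NFS mount rather than an hdfs:// URL:

```xml
<!-- hbase-site.xml: point HBase at a file system other than HDFS -->
<property>
  <name>hbase.rootdir</name>
  <!-- example only: a file:// URL for a locally mounted NFS path -->
  <value>file:///mnt/nfs/hbase</value>
</property>
```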

Is my above understanding correct? Can anyone please explain if I am wrong?

Thanks,
MS
On 12 August 2011 00:31, Corey M. Dorwart 
<[email protected]> wrote:
Hello MS-

Welcome to Hadoop MapReduce programming!

The first step is to follow the MapReduce tutorial on Apache's website 
(http://hadoop.apache.org/common/docs/current/mapred_tutorial.html). Without 
much Java experience you are going to be at a disadvantage, but you are not 
alone. You may want to give Apache Pig a go (http://pig.apache.org/). Pig 
offers a much simpler way to write MapReduce programs, in a language that 
more closely resembles SQL; Pig acts as an intermediary between you and the 
MapReduce code. They have great tutorials for that as well.

Most MapReduce code is requirement-specific, but a first Word Count 
application is simple, and examples can be found readily on the web.
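To give a feel for what Word Count does, here is a rough sketch of the two 
phases in plain Python (this illustrates the map/reduce logic only; a real 
Hadoop job would use the Java MapReduce API, and the framework would shuffle 
and sort the pairs between the phases for you):

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input lines."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce: sum the counts for each distinct word."""
    counts = defaultdict(int)
    for word, count in pairs:
        counts[word] += count
    return dict(counts)

def word_count(lines):
    """Run the map phase, then aggregate its output in the reduce phase."""
    return reduce_phase(map_phase(lines))
```

In a real cluster the map and reduce phases run on different machines over 
splits of the input; here the reduce simply aggregates everything in memory.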

Good Luck!

-Corey

From: M S Vishwanath Bhat [mailto:[email protected]]
Sent: Thursday, August 11, 2011 3:00 PM
To: [email protected]<mailto:[email protected]>
Subject: Need Help with HBase

Hi,

I'm a newbie to Hadoop and map/reduce applications. I have set up a cluster 
and am just running the example map/reduce applications that come with the 
Hadoop source code.

I want to run some more applications, but I'm not a Java developer.

So if anyone is willing to share the map/reduce applications they wrote, it 
would be a great help to me.


Thanks in Advance,


Cheers,
MS
