Nishant Khurana wrote:
Hi,
Thanks for pointing me to the new link. A google search on Hbase Shell was
returning the old one at the top, so I was trying to use that.
I've capitalized and tripled the size of the banner at the top of that page
saying the old shell no longer applies.
So from your
answers below, I think we always have to define the column families in the
schema first and then alter the table, adding fully qualified column names, in
order to add multiple columns to a column family.
No need to do a table 'alter'. Just add new columns when you do an
update. It will just work as long as the column has an existing column
family's prefix.
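For example (made-up table and column names):

put 'mytable', 'row1', 'family1:brandnewcolumn', 'some value'

As long as 'family1' already exists on the table, the new column just shows
up; no schema change involved.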
St.Ack
I will try to use the scripts mentioned there.
Thanks
On Sat, Nov 22, 2008 at 1:47 AM, Michael Stack <[EMAIL PROTECTED]> wrote:
Nishant Khurana wrote:
Hi,
I was looking at the API Docs and HQL shell to create tables for Hbase. I
couldn't figure out:
1) How to create column families such that each column family contains more
than one column through the shell.
Please tell us how you tripped over HQL. It was removed from hbase quite a
while ago. We need to fix our documentation links if folks are continuing
to trip over it.
HBase shell is documented here, off the main wiki page for hbase:
http://wiki.apache.org/hadoop/Hbase/Shell.
To answer your question, there is nothing special to do when adding new
columns; just create the column family and then add columns. As long as the
columns carry an existing column family prefix, hbase won't complain.
To create a table 'x', with a column family 'x', do this in the shell:
create 'x', 'x'
Type 'help' in the shell for more info.
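For example, to get more than one column into that single family (made-up row
and column names; see 'help' for the exact syntax in your version):

put 'x', 'row1', 'x:col1', 'value1'
put 'x', 'row1', 'x:col2', 'value2'
scan 'x'

Both 'x:col1' and 'x:col2' land in the one family 'x' with no further schema
work.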
2) Also, if I want to do it programmatically, the HColumnDescriptor class only
lets you create column families with single columns. Is there a way to do
it?
See above. Yes, you only create the column family when defining your
schema. Thereafter you can add columns without modifying the schema.
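Off the top of my head, untested, here is a rough jruby sketch of the
programmatic route (run it with something like
'${HBASE_HOME}/bin/hbase org.jruby.Main yourscript.rb'; table and family names
are made up, and constructors may differ a bit in your version, so check the
javadoc):

include Java
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.HTableDescriptor
import org.apache.hadoop.hbase.HColumnDescriptor
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.client.HTable
import org.apache.hadoop.hbase.io.BatchUpdate

conf = HBaseConfiguration.new
admin = HBaseAdmin.new(conf)

# Schema holds the family only.
desc = HTableDescriptor.new('mytable')
# Trailing ':' on the family name may or may not be wanted in your version.
desc.addFamily(HColumnDescriptor.new('info:'))
admin.createTable(desc)

# Columns come later, on update, under the family prefix.
table = HTable.new(conf, 'mytable')
update = BatchUpdate.new('row1')
update.put('info:anycolumn', 'any value'.to_java_bytes)
table.commit(update)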
Also, if I am not wrong, we can store multiple values of each column in each
column family?
Yes.
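For example (made-up names; the VERSIONS option is described under 'help', and
how many versions are kept is set on the column family):

put 'x', 'row1', 'x:col1', 'first'
put 'x', 'row1', 'x:col1', 'second'
get 'x', 'row1', {COLUMN => 'x:col1', VERSIONS => 2}

Each write is kept as a separate timestamped version of the cell.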
Is there a way I can do a bulk import from a structured text file into Hbase
through the shell, or is there any script already available?
Usually folks write mapreduce jobs to load up hbase. If you want to write
a script to load hbase, see the aforementioned shell link. It has some notes
on how to script the shell and on how to write a script in jruby.
Otherwise, there are python/jython examples here:
http://wiki.apache.org/hadoop/Hbase/Shell
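To give you an idea, a bare-bones jruby loader for a tab-delimited file might
look something like the below (untested; made-up file layout, table and
family, and the same caveat as above about the client API in your version):

# Each input line: rowkey<TAB>value. Table 'mytable' with family 'info:'
# is assumed to already exist.
include Java
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.HTable
import org.apache.hadoop.hbase.io.BatchUpdate

table = HTable.new(HBaseConfiguration.new, 'mytable')
File.foreach('data.txt') do |line|
  row, value = line.chomp.split("\t")
  update = BatchUpdate.new(row)
  update.put('info:value', value.to_java_bytes)
  table.commit(update)
end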
St.Ack