Hi all,

I have a scenario in which my HBase tables will be fed more than
250 GB of data every day. I have to analyze that data using MR jobs and save the
output back into an HBase table.

1.      My concern is: will HBase be able to handle such a data volume, given
that it is built for big data?

2.      What hardware / HBase configuration points must I keep in mind when
creating a cluster for such a requirement?

a.      How many region servers?

b.      How many regions per region server?

3.      My schema is such that in one table with one CF, there will be
millions of column qualifiers. What can be the consequences of such a design?

Please suggest.

Regards,
Stuti Awasthi
HCL Comnet Systems and Services Ltd
F-8/9 Basement, Sec-3,Noida.


