Hi Sriram,

You can use the bulk load utility to insert a large volume of data into HBase:
http://archive.cloudera.com/cdh/3/hbase/bulk-loads.html
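In case a concrete starting point helps, here is a rough sketch of the driver side of a bulk load, assuming CDH3-era HBase 0.90 APIs. The table name ("my_table"), column family ("cf"), qualifier ("v"), and the CSV input format are placeholders for illustration, not anything from your setup:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {

  // Turns input lines of the form "rowkey,value" into KeyValues for a
  // single column family. Purely illustrative.
  public static class CsvToKeyValueMapper
      extends Mapper<LongWritable, Text, ImmutableBytesWritable, KeyValue> {
    @Override
    protected void map(LongWritable offset, Text line, Context ctx)
        throws IOException, InterruptedException {
      String[] parts = line.toString().split(",", 2);
      byte[] row = Bytes.toBytes(parts[0]);
      KeyValue kv = new KeyValue(row, Bytes.toBytes("cf"),
          Bytes.toBytes("v"), Bytes.toBytes(parts[1]));
      ctx.write(new ImmutableBytesWritable(row), kv);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "hbase-bulk-load");
    job.setJarByClass(BulkLoadDriver.class);
    job.setMapperClass(CsvToKeyValueMapper.class);
    job.setMapOutputKeyClass(ImmutableBytesWritable.class);
    job.setMapOutputValueClass(KeyValue.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));

    // Sets up TotalOrderPartitioner and HFileOutputFormat so the reducers
    // write HFiles aligned with the table's current region boundaries.
    HTable table = new HTable(conf, "my_table");
    HFileOutputFormat.configureIncrementalLoad(job, table);

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Once the job finishes, you load the resulting HFiles into the table with the completebulkload tool described on that page, e.g.:

  hadoop jar hbase-VERSION.jar completebulkload /output/path my_table

Because this writes HFiles directly instead of going through the normal write path, it sidesteps per-RPC overhead entirely.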
There is an alternative I use in my application, where I had to insert a large amount of data from a MapReduce job: the asynchronous HBase client (https://github.com/stumbleupon/asynchbase) instead of HTable (which I suppose TableMapReduceUtil uses internally). I found the async client quite a bit faster.
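A minimal sketch of what that looks like; the quorum address, table, family, qualifier, and values below are all placeholders:

import com.stumbleupon.async.Callback;
import com.stumbleupon.async.Deferred;
import org.hbase.async.HBaseClient;
import org.hbase.async.PutRequest;

public class AsyncPutExample {
  public static void main(String[] args) throws Exception {
    final HBaseClient client = new HBaseClient("zk-host:2181");
    try {
      PutRequest put = new PutRequest("my_table", "row-1", "cf", "q", "value");
      Deferred<Object> d = client.put(put);
      // Puts are buffered and flushed in the background; attach an
      // errback so failures are not silently dropped.
      d.addErrback(new Callback<Object, Exception>() {
        public Object call(Exception e) {
          e.printStackTrace();
          return e;
        }
      });
      d.join();  // Blocking here is just for the demo.
    } finally {
      // shutdown() flushes any buffered edits before releasing resources.
      client.shutdown().joinUninterruptibly();
    }
  }
}

The win comes from not blocking a thread per RPC: one client multiplexes all region server connections, so you can issue thousands of puts without the thread-per-operation cost that tends to trigger errors like "unable to create new native thread".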
Regards,
Dhaval

On Wed, Aug 31, 2011 at 9:44 PM, Stack <[email protected]> wrote:
> On Wed, Aug 31, 2011 at 1:06 AM, sriram <[email protected]> wrote:
> > Error: unable to create new native thread
> > Only 8200 values are inserted; the remaining lakhs of records are not
> > inserted and the job failed. Any ideas or solutions?
>
> You are getting an OutOfMemoryError? Is it coming from mapreduce or is
> it from hbase? Can you give your processes more memory? What are
> you trying to insert? Is it massive?
>
> St.Ack