BatchUpdate is deprecated and was removed after 0.20; the name was also misleading, since it batched edits across multiple columns but not across rows.
If I'm guessing correctly, you want to do an initial import of your data? The brute-force way is to write a MapReduce job, but I would first recommend that you look into the bulk loader tools, such as http://hbase.apache.org/docs/r0.89.20100924/bulk-loads.html

J-D

On Mon, Jan 10, 2011 at 10:10 AM, Weishung Chung <[email protected]> wrote:
> Thank you :)
> Could I use org.apache.hadoop.hbase.io.BatchUpdate? Would it be faster than
> put(List<Put>)?
> Also, would you recommend the use of MapReduce to accomplish the same thing?
>
> On Mon, Jan 10, 2011 at 11:38 AM, Jean-Daniel Cryans
> <[email protected]> wrote:
>
>> HBaseHUT is used to solve the Get+Put problem, so if that's your problem
>> as well then do look into it.
>>
>> To answer your first question, that method will group Puts by region
>> server, meaning that it will do anywhere between 1 and n RPCs, where n is
>> the number of region servers, and that's done in parallel.
>>
>> J-D
>>
>> On Mon, Jan 10, 2011 at 9:06 AM, Weishung Chung <[email protected]>
>> wrote:
>> > What is the difference between the above put method and the following
>> > capability of the HBaseHUT package?
>> > https://github.com/sematext/HBaseHUT
>> >
>> > On Mon, Jan 10, 2011 at 10:58 AM, Weishung Chung <[email protected]>
>> > wrote:
>> >
>> >> Does the HTable.put(List<Put> puts) method perform a batch insert with
>> >> a single RPC call? I am going to insert a lot of values into a column
>> >> family and would like to increase the write speed.
>> >> Thank you.
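For reference, here is a minimal sketch of the batched-write path discussed above, against the 0.89/0.90-era client API. The table name "mytable", family "cf", and row/value contents are made up for illustration; it assumes a reachable cluster configured via hbase-site.xml on the classpath.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class BatchPutExample {
    public static void main(String[] args) throws IOException {
        // Hypothetical table "mytable" with a column family "cf".
        HTable table = new HTable(HBaseConfiguration.create(), "mytable");
        try {
            // Optional: disabling auto-flush lets the client buffer Puts
            // instead of sending one RPC per Put.
            table.setAutoFlush(false);

            List<Put> puts = new ArrayList<Put>();
            for (int i = 0; i < 10000; i++) {
                Put put = new Put(Bytes.toBytes("row-" + i));
                put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"),
                        Bytes.toBytes("value-" + i));
                puts.add(put);
            }
            // As described above, the client groups these Puts by region
            // server and issues between 1 and n RPCs in parallel, where n
            // is the number of region servers holding the affected regions.
            table.put(puts);
        } finally {
            table.close(); // flushes any remaining buffered edits
        }
    }
}
```

This only helps with write-side batching; for a one-time initial import of a large dataset, the bulk-load tools linked above bypass the write path entirely and are usually faster.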
