Stuti,

Also, you can look into its source code to see how to upload data using your MapReduce job.

-P
On Tue, Nov 15, 2011 at 1:26 PM, Harsh J <[email protected]> wrote:
> Stuti,
>
> For simple delimited text files, like the one you describe, have you
> considered simply using the ImportTSV tool to bulk-load the data into
> HBase?
>
> See http://hbase.apache.org/bulk-loads.html
>
> On 15-Nov-2011, at 1:19 PM, Stuti Awasthi wrote:
>
>> Hi all,
>> I have a file in HDFS in "KeyValueTextInputFormat". I want to process
>> this file through a Map and put the output into HBase.
>>
>> Input file:
>> Key1 Val1
>> Key2 Val2
>>
>> I am not sure which interfaces and classes to use, i.e. whether my
>> custom Mapper should extend MapReduceBase or TableMapper.
>> The output will be stored in a table, say X, with column family "Set",
>> as:
>>
>> Put put = new Put(Bytes.toBytes(Key1));
>> put.add(Bytes.toBytes("Set"), Bytes.toBytes(Val1), null);
>>
>> Please suggest.
>>
>> Regards,
>> Stuti Awasthi
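For reference, a minimal sketch of the MapReduce route being discussed, assuming HBase 0.90-era APIs. Note that TableMapper is for *reading* from HBase and MapReduceBase is the old API; a job that only *writes* Puts can use a plain new-API Mapper plus TableMapReduceUtil. The class name HdfsToHBase and the input path argument are made up for illustration; the table name "X" and column family "Set" come from the original question. This is untested against a live cluster, so treat it as a starting point, not a definitive implementation.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;

public class HdfsToHBase {

  // KeyValueTextInputFormat hands the mapper one Text key / Text value
  // pair per input line (split on the first tab by default).
  static class PutMapper
      extends Mapper<Text, Text, ImmutableBytesWritable, Put> {
    @Override
    protected void map(Text key, Text value, Context context)
        throws IOException, InterruptedException {
      Put put = new Put(Bytes.toBytes(key.toString()));
      // Family "Set", qualifier = the value, cell value null,
      // exactly as in the snippet from the original mail.
      put.add(Bytes.toBytes("Set"), Bytes.toBytes(value.toString()), null);
      context.write(new ImmutableBytesWritable(put.getRow()), put);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "hdfs-to-hbase");
    job.setJarByClass(HdfsToHBase.class);

    job.setInputFormatClass(KeyValueTextInputFormat.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));

    // Wires up TableOutputFormat for table "X" plus the needed
    // serialization and HBase jars on the job's classpath.
    TableMapReduceUtil.initTableReducerJob("X", null, job);
    job.setMapperClass(PutMapper.class);
    job.setNumReduceTasks(0); // map-only: Puts go straight to the table

    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

If the input really is plain tab-delimited text, the ImportTSV route Harsh suggests avoids writing any of this; something along the lines of `hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns=HBASE_ROW_KEY,Set:val X <hdfs-input-path>` (column mapping here is a hypothetical example) loads the same data with a stock MapReduce job.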
