Hi Dean,

The way we do it is to take a dump of the data in text format, use
TextInputFormat for the mapper, and set up the reducer with TableMapReduceUtil (
http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/mapred/TableMapReduceUtil.html)
so the rows are uploaded one by one, with each mapper handling its own slice
of the input.

You can take a look at the example in SVN:
http://svn.apache.org/repos/asf/hbase/trunk/src/examples/mapreduce/org/apache/hadoop/hbase/mapreduce/SampleUploader.java
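In case the SVN link rots, here is a minimal sketch of that pattern: the
mapper parses each line of the text dump into a Put, and
TableMapReduceUtil wires in an identity reducer that writes the Puts into
HBase. The table name ("accounts"), column family ("data"), and the
comma-separated line format are just assumptions for illustration; adjust
them to your dump.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class TextUploader {

  // Parses "rowkey,value" lines from the text dump and emits one Put
  // per line, keyed by the row key.
  static class Uploader
      extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    protected void map(LongWritable offset, Text line, Context context)
        throws IOException, InterruptedException {
      String[] parts = line.toString().split(",", 2);
      byte[] rowKey = Bytes.toBytes(parts[0]);
      Put put = new Put(rowKey);
      // Hypothetical column family "data", qualifier "value".
      put.add(Bytes.toBytes("data"), Bytes.toBytes("value"),
          Bytes.toBytes(parts[1]));
      context.write(new ImmutableBytesWritable(rowKey), put);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "text-upload");
    job.setJarByClass(TextUploader.class);
    job.setMapperClass(Uploader.class);
    FileInputFormat.setInputPaths(job, new Path(args[0]));
    // Passing null for the reducer class makes TableMapReduceUtil use
    // the identity table reducer, which just writes the Puts emitted
    // by the mapper into the "accounts" table.
    TableMapReduceUtil.initTableReducerJob("accounts", null, job);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Run it with the path to the dumped text file as the only argument; Hadoop
will split the file across mappers for you, so you don't have to divide
the rows by hand.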

Hope it helps.
Hari

On Fri, Dec 17, 2010 at 10:07 PM, Hiller, Dean (Contractor) <
[email protected]> wrote:

> I am going to try to suck in data from a normal db to be able to
> prototype with our existing data.  I plan on writing a Map part of
> Map/Reduce to suck in that data.  I am very new to this, so I am trying
> to get started on this small example...
>
>
>
> I have 100 accounts, and to start I will pull them down to two nodes, so
> I have a list I will divide in two.  How do I send the right account
> numbers (which will be my keys for now, until I get into this more and
> redesign it)?  I see the Map method takes the key, but how do I feed the
> list in so the Map methods get each key to process one by one?  And I
> assume the Map jobs will run closest to where the data is going to be
> written, correct?
>
>
>
> Am I way off base in my thinking here?  I am very new to how HBase
> works.
>
>
>
> Thanks,
>
> Dean
>
>
