You need to instantiate one HTable per map task, then reuse it for every
map() invocation. Sharing the actual object between JVMs isn't what
you want to do.
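
Something like this (an untested sketch; the class name, table name, column
family and CSV layout are placeholders you'd swap for your own schema) is
what I mean: open the HTable once in setup(), reuse it in every map() call,
and close it in cleanup():

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CsvToHBaseMapper
    extends Mapper<LongWritable, Text, NullWritable, NullWritable> {

  private HTable table;

  @Override
  protected void setup(Context context) throws IOException {
    // One HTable per map task (per JVM), created once.
    Configuration conf = HBaseConfiguration.create(context.getConfiguration());
    table = new HTable(conf, "my_table");   // example table name
    table.setAutoFlush(false);              // buffer puts client-side
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Example CSV layout: row key in the first field, one value in the second.
    String[] fields = value.toString().split(",");
    Put put = new Put(Bytes.toBytes(fields[0]));
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(fields[1]));
    table.put(put);   // reuses the same HTable for every row
  }

  @Override
  protected void cleanup(Context context) throws IOException {
    table.flushCommits();   // send anything still buffered
    table.close();
  }
}

With setAutoFlush(false) the puts get batched in the client write buffer,
which is usually what you want for a bulk load like this.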

J-D

On Mon, Dec 6, 2010 at 9:58 AM, rajgopalv <[email protected]> wrote:
>
> Hi,
> I'm writing a MR job to read values from a CSV file and insert it into hbase
> using Htable.put()
> So each map function will insert one row.
> There is no reduce function.
>
> But now, i open a htable instance inside  every map function. This is bad..
> i know...
> but how can I share a common htable instance between multiple map jobs. ?
