Thanks, that helps!

On Tue, Feb 28, 2012 at 7:02 AM, Tim Robertson <[email protected]> wrote:

> Hi,
>
> You can call context.write() multiple times in reduce() to emit
> more than one row.
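>
> For example, something like this (a rough, untested sketch -- the
> reducer's input types, the column family "cf", and the qualifier
> "val" are made up for illustration):
>
>   import java.io.IOException;
>   import org.apache.hadoop.hbase.client.Put;
>   import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
>   import org.apache.hadoop.hbase.mapreduce.TableReducer;
>   import org.apache.hadoop.hbase.util.Bytes;
>   import org.apache.hadoop.io.Text;
>
>   public class MultiPutReducer
>       extends TableReducer<Text, Text, ImmutableBytesWritable> {
>     @Override
>     protected void reduce(Text key, Iterable<Text> values,
>         Context context) throws IOException, InterruptedException {
>       int i = 0;
>       for (Text value : values) {
>         // one Put per output row: every context.write() emits a row
>         Put put = new Put(Bytes.toBytes(key.toString() + "_" + i++));
>         put.add(Bytes.toBytes("cf"), Bytes.toBytes("val"),
>             Bytes.toBytes(value.toString()));
>         context.write(new ImmutableBytesWritable(put.getRow()), put);
>       }
>     }
>   }
>
> You'd wire it up with TableMapReduceUtil.initTableReducerJob() as
> usual.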
>
> If you are creating the Puts in the map function, then you need to
> call setMapSpeculativeExecution(false) on the job conf, or else
> Hadoop *might* spawn more than one attempt for a given task, meaning
> you'll get duplicate data.
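>
> Roughly (untested; shown with the old-API JobConf, which is where
> that setter lives -- the raw property below works with either API):
>
>   import org.apache.hadoop.mapred.JobConf;
>
>   JobConf jobConf = new JobConf(MyJob.class);  // MyJob: your job class
>   jobConf.setMapSpeculativeExecution(false);
>
>   // equivalent raw property, also settable on a new-API Job's conf:
>   // conf.setBoolean("mapred.map.tasks.speculative.execution", false);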
>
> HTH,
> Tim
>
>
>
> On Tue, Feb 28, 2012 at 3:51 PM, T Vinod Gupta <[email protected]>
> wrote:
> > Ben,
> > I didn't quite understand your concern. What speculative execution
> > are you referring to?
> >
> > thanks
> > vinod
> >
> > On Tue, Feb 28, 2012 at 6:45 AM, Ben Snively <[email protected]> wrote:
> >
> >> I think the short answer to that is yes, but the part I would be
> >> worried about is the following:
> >>
> >>
> >> I guess, along with that: how do you manage speculative execution
> >> on the reducer (or is that only for map tasks)?
> >>
> >> I've always ended up creating import files and bulk-loading them
> >> into HBase.
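> >>
> >> Roughly, for reference (untested, and the table name "my_table" is
> >> made up): configure the job to write HFiles instead of live Puts,
> >>
> >>   // in job setup:
> >>   HFileOutputFormat.configureIncrementalLoad(job,
> >>       new HTable(conf, "my_table"));
> >>
> >> and then run the completebulkload tool against the job's output
> >> directory.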
> >>
> >> Thanks,
> >> Ben
> >>
> >> On Tue, Feb 28, 2012 at 9:34 AM, T Vinod Gupta
> >> <[email protected]> wrote:
> >>
> >> > While doing MapReduce on HBase tables, is it possible to do
> >> > multiple Puts in the reducer? What I want is a way to write
> >> > multiple rows. If it's not possible, then what are the other
> >> > alternatives? I mean, like creating a wider table in that case.
> >> >
> >> > thanks
> >> >
> >>
>
