At least in 0.19, conf.setSpeculativeExecution(false) is the equivalent of:
conf.setMapSpeculativeExecution(false);
conf.setReduceSpeculativeExecution(false);
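
For Marcus's situation, a minimal driver sketch with the old org.apache.hadoop.mapred
API might look like the following (MyDriver, MyMapper and MyReducer are hypothetical
stand-ins for the job's own classes):

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

// Job setup with speculative execution disabled, so each input split is
// handled by exactly one task attempt (barring retries after a failure).
JobConf conf = new JobConf(MyDriver.class);
conf.setJobName("db-load");
conf.setMapperClass(MyMapper.class);
conf.setReducerClass(MyReducer.class);
// Disable speculative execution for both map and reduce phases:
conf.setSpeculativeExecution(false);
// ...or, equivalently, per phase:
// conf.setMapSpeculativeExecution(false);
// conf.setReduceSpeculativeExecution(false);
JobClient.runJob(conf);

The same settings can be passed on the command line with
-D mapred.map.tasks.speculative.execution=false and
-D mapred.reduce.tasks.speculative.execution=false if the driver goes through
ToolRunner. Note that even with speculation off, a failed task attempt will be
retried, so database writes should still be made idempotent where possible.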

On Tue, Jul 7, 2009 at 4:05 AM, Thibaut_ <[email protected]> wrote:

>
> Hi,
>
> conf.setSpeculativeExecution(false);
>
> or
>
> conf.setMapSpeculativeExecution(false);
> conf.setReduceSpeculativeExecution(false);
>
> Thibaut
>
>
> marcusherou wrote:
> >
> > Hi.
> >
> > I've noticed that Hadoop spawns parallel copies of the same task on
> > different hosts. I understand that this is done to improve job
> > performance by prioritizing fast-running tasks. However, since our
> > jobs connect to databases, this leads to conflicts when inserting,
> > updating and deleting data (duplicate keys etc.). Yes, I know I should
> > consider Hadoop a "shared nothing" architecture, but I really must
> > connect to databases in the jobs. I've created a sharded DB solution
> > which scales as well, or I would be doomed...
> >
> > Any hints on how to disable this feature, or how to reduce its impact?
> >
> > Cheers
> >
> > /Marcus
> >
> > --
> > Marcus Herou CTO and co-founder Tailsweep AB
> > +46702561312
> > [email protected]
> > http://www.tailsweep.com/
> >
> >
>
> --
> View this message in context:
> http://www.nabble.com/Parallell-maps-tp24303607p24371262.html
> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>
>


-- 
Pro Hadoop, a book to guide you from beginner to Hadoop mastery,
http://www.amazon.com/dp/1430219424?tag=jewlerymall
www.prohadoopbook.com, a community for Hadoop professionals
