You can keep a counter in external storage (a NoSQL store, for example):
have the task read and increment it, and throw an exception while it's low.
When the counter reaches 2, stop throwing the exception so that the task passes.
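
For the archives, here's a minimal, untested sketch of that idea for
spark-shell. A plain local file (the /tmp path is arbitrary) stands in
for the NoSQL store, and a single partition keeps tasks from racing on
the counter:

  // Start with: spark-shell --master "local[2,4]"
  // The second number raises the task-failure limit to 4; plain
  // local[N] gives up after the first failed task.
  import java.nio.file.{Files, Paths, StandardOpenOption}

  val counterPath = "/tmp/task-failure-counter"  // arbitrary demo path
  Files.deleteIfExists(Paths.get(counterPath))

  sc.parallelize(1 to 10, 1).mapPartitions { iter =>
    val counter = Paths.get(counterPath)
    // Read the attempt count from the "external store" and bump it.
    val attempts =
      if (Files.exists(counter))
        new String(Files.readAllBytes(counter)).trim.toInt
      else 0
    Files.write(counter, (attempts + 1).toString.getBytes,
      StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)
    // Fail the first two attempts; once the counter reaches 2, pass.
    if (attempts < 2)
      throw new RuntimeException(s"deliberate failure, attempt ${attempts + 1}")
    iter
  }.count()

The web UI then shows the failed task attempts for an ultimately
successful job. If even a file feels like too much for a demo,
TaskContext.get.attemptNumber carries the same information without any
external store.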

FYI

On Sun, Jun 19, 2016 at 3:22 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Thanks Burak for the idea, but that *only* fails the tasks, which
> eventually fails the entire job; it doesn't fail a particular stage
> (just once or twice) before the job eventually succeeds. The idea is
> to see the attempts in the web UI, as there's special handling for
> cases where a stage fails once or twice before finishing up properly.
>
> Any ideas? I've got one, but it requires quite an extensive cluster
> setup, which I'd like to avoid if possible. I'm after something I
> could use during workshops or demos that others could easily
> reproduce to learn Spark's internals.
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Jun 19, 2016 at 5:25 AM, Burak Yavuz <brk...@gmail.com> wrote:
> > Hi Jacek,
> >
> > Can't you simply have a mapPartitions task throw an exception or
> > something? Are you trying to do something more esoteric?
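> >
> > For instance, something as blunt as this (a rough, untested sketch
> > for spark-shell) fails every task in the stage:
> >
> >   sc.parallelize(1 to 100).mapPartitions { _ =>
> >     // every task blows up immediately
> >     throw new RuntimeException("deliberate task failure")
> >   }.count()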
> >
> > Best,
> > Burak
> >
> > On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> >>
> >> Hi,
> >>
> >> Following up on this question, is a stage considered failed only
> >> when there is a FetchFailed exception? Can I have a failed stage in
> >> a single-stage job?
> >>
> >> I'd appreciate any help on this... (as my family doesn't like me
> >> spending the weekend with Spark :))
> >>
> >> Regards,
> >> Jacek Laskowski
> >> ----
> >> https://medium.com/@jaceklaskowski/
> >> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> >> Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> >> > Hi,
> >> >
> >> > I'm trying to see some stats about failing stages in the web UI
> >> > and want to "create" a few failed stages. Is this possible using
> >> > spark-shell at all? What setup of Spark/spark-shell would allow
> >> > for such a scenario?
> >> >
> >> > I could write Scala code if that's the only way to get failing
> >> > stages.
> >> >
> >> > Please guide. Thanks.
> >> >
> >> > /me on to reviewing the Spark code...
> >> >
> >> > Regards,
> >> > Jacek Laskowski
> >> > ----
> >> > https://medium.com/@jaceklaskowski/
> >> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
> >> > Follow me at https://twitter.com/jaceklaskowski
> >>
> >
>
