Hi Jacek,

Can't you simply have a mapPartitions task throw an exception? Or are you
trying to do something more esoteric?
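
Something like this in spark-shell should do it (the partition count and
the exception message below are arbitrary). Every task throws, so once task
retries are exhausted (IIRC a single failure is enough in plain local mode,
while on a cluster it takes spark.task.maxFailures attempts) the stage is
marked as failed and shows up under "Failed Stages" in the web UI:

    val rdd = sc.parallelize(1 to 100, 4)

    // each non-empty partition throws, so every task fails deterministically
    rdd.mapPartitions { (iter: Iterator[Int]) =>
      if (iter.hasNext) throw new RuntimeException("boom")
      iter
    }.count()

So you don't need a FetchFailed specifically: any exception that exhausts
the task retries fails the stage, even in a single-stage job.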

Best,
Burak

On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Following up on this question, is a stage considered failed only when
> there is a FetchFailed exception? Can I have a failed stage in a
> single-stage job?
>
> Appreciate any help on this... (as my family doesn't like me spending
> the weekend with Spark :))
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> > Hi,
> >
> > I'm trying to see some stats about failing stages in the web UI and want
> > to "create" a few failed stages. Is this possible using spark-shell at
> > all? Which setup of Spark/spark-shell would allow for such a scenario?
> >
> > I could write some Scala code if that's the only way to get failing stages.
> >
> > Please guide. Thanks.
> >
> > /me on to reviewing the Spark code...
> >
> > Pozdrawiam,
> > Jacek Laskowski
> > ----
> > https://medium.com/@jaceklaskowski/
> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
> > Follow me at https://twitter.com/jaceklaskowski
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
