Re: How can I tell if a Spark job is successful or not?

2017-08-10 Thread Ryan
you could exit with an error code just like a normal java/scala application, and
get it back from the driver/yarn
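To make the exit-code idea concrete, here is a minimal sketch of that pattern (in Python for brevity; `exit_code_for` and the `job` callable are illustrative stand-ins for your driver logic, not a Spark API). In cluster mode, a non-zero driver exit is what makes YARN report the application's final status as failed:

```python
import sys


def exit_code_for(job):
    """Run the driver logic `job` (a placeholder callable) and map the
    outcome to a process exit code: 0 on success, 1 on any exception."""
    try:
        job()
        return 0
    except Exception:
        # Failure-specific cleanup could run here before reporting the code.
        return 1

# In a real driver you would end with something like:
#     sys.exit(exit_code_for(main_job))
# so that YARN (or the launching shell) sees a non-zero status on failure.
```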

On Fri, Aug 11, 2017 at 9:55 AM, Wei Zhang wrote:

> I suppose you can find the job status from Yarn UI application view.
>
>
>
> Cheers,
>
> -z
>
>
>
> *From:* 陈宇航 [mailto:yuhang.c...@foxmail.com]
> *Sent:* Thursday, August 10, 2017 5:23 PM
> *To:* user 
> *Subject:* How can I tell if a Spark job is successful or not?
>
>
>
> I want to do some clean-up after a Spark job finishes, and the action I
> take depends on whether the job succeeded or not.
>
> So where can I get the result for the job?
>
> I already tried a SparkListener; it worked fine when the job succeeded,
> but when the job failed, the listener did not seem to be called.
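On the listener point: in Spark's Scala/Java API, `SparkListener.onJobEnd` is also invoked for failed jobs, and the `SparkListenerJobEnd` event carries a `JobResult` that is either `JobSucceeded` or a `JobFailed` wrapping the exception, so cleanup can branch on that result as long as the driver itself stays alive. What a listener cannot observe is the driver process dying, which is where the exit-code and YARN-status approaches come in. A sketch of that branch-on-result pattern (illustrative Python, not PySpark's listener API; the `JobEnd`/`CleanupListener` names are made up here):

```python
class JobEnd:
    """Stand-in for a job-end event, mirroring SparkListenerJobEnd."""

    def __init__(self, job_id, succeeded, error=None):
        self.job_id = job_id
        self.succeeded = succeeded
        self.error = error  # a failed result carries the exception


class CleanupListener:
    """Runs different cleanup depending on the job's final result."""

    def __init__(self):
        self.log = []

    def on_job_end(self, event):
        # Called for both outcomes; pick the cleanup accordingly.
        if event.succeeded:
            self.log.append(f"job {event.job_id}: success cleanup")
        else:
            self.log.append(f"job {event.job_id}: failure cleanup ({event.error})")
```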
>
