Thanks Sean, you are right. If the driver program is running, then I can
handle shutdown in the main exit path. But if the driver machine crashes
(or if you just stop the application, for example by killing the driver
process), then a shutdown hook is the only option, isn't it? What I am
trying to say is that just calling ssc.stop in sys.ShutdownHookThread or
Runtime.getRuntime().addShutdownHook (in Java) won't work anymore. I need
to use Utils.addShutdownHook with a priority. So I am just checking whether
Spark Streaming can make graceful shutdown the default shutdown mechanism.
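To make it concrete, here is roughly what I have in mind (just a sketch:
Utils.addShutdownHook is private[spark] in 1.4, so application code has to
sit under an org.apache.spark package or reach it via reflection, and the
priority value is my assumption from reading the source):

  import org.apache.spark.streaming.StreamingContext
  import org.apache.spark.util.Utils

  // ssc is the application's running StreamingContext.
  // Priority 51 is just above SPARK_CONTEXT_SHUTDOWN_PRIORITY (50 in
  // 1.4), so this hook fires before the hook that tears down the
  // SparkContext, and the graceful stop can drain received blocks.
  val hookRef = Utils.addShutdownHook(priority = 51) { () =>
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }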

Dibyendu

On Tue, May 19, 2015 at 1:03 PM, Sean Owen <so...@cloudera.com> wrote:

> I don't think you should rely on a shutdown hook. Ideally you try to
> stop it in the main exit path of your program, even in case of an
> exception.
>
> On Tue, May 19, 2015 at 7:59 AM, Dibyendu Bhattacharya
> <dibyendu.bhattach...@gmail.com> wrote:
> > You mean to say within Runtime.getRuntime().addShutdownHook I call
> > ssc.stop(stopSparkContext = true, stopGracefully = true)?
> >
> > This won't work anymore in 1.4.
> >
> > The SparkContext gets stopped before the Receiver has processed all
> > received blocks, and I see the exception below in the logs. But if I
> > add Utils.addShutdownHook with the priority I mentioned, only then
> > does graceful shutdown work. In that case the shutdown hooks run in
> > priority order.
> >
>
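For reference, the main-exit-path approach Sean suggests above would look
something like the sketch below in the driver (the app name, conf, and
batch interval are placeholders):

  import org.apache.spark.SparkConf
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  object MyStreamingApp {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf().setAppName("my-streaming-app")
      val ssc = new StreamingContext(conf, Seconds(10))
      // ... wire up input streams and output operations here ...
      try {
        ssc.start()
        ssc.awaitTermination()
      } finally {
        // Runs on normal exit and on exceptions, but not when the
        // driver process is killed -- which is exactly my concern.
        ssc.stop(stopSparkContext = true, stopGracefully = true)
      }
    }
  }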
