You can add a shutdown hook to your JVM and ask the Spark StreamingContext to stop gracefully.
import org.apache.spark.streaming.StreamingContext

/**
 * Registers a JVM shutdown hook that stops the StreamingContext gracefully.
 * @param ssCtx the StreamingContext to stop
 */
def addShutdownHook(ssCtx: StreamingContext) = {
  Runtime.getRuntime.addShutdownHook(new Thread() {
    override def run() = {
      println("In shutdown hook")
      // Stop the StreamingContext (and the underlying SparkContext)
      // gracefully, letting in-flight batches finish.
      ssCtx.stop(stopSparkContext = true, stopGracefully = true)
    }
  })
}

Pankaj

On Fri, Dec 22, 2017 at 9:56 AM, Toy <noppani...@gmail.com> wrote:
> I'm trying to write a deployment job for a Spark application. Basically the
> job sends yarn application -kill app_id to the cluster, but after the
> application receives the signal it dies without finishing whatever it is
> processing or stopping the stream.
>
> I'm using Spark Streaming. What's the best way to stop a Spark application
> so we won't lose any data?
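For completeness, here is a minimal sketch of how the helper above might be
wired into a streaming app. The app name, batch interval, and socket source
are illustrative assumptions, not something from this thread:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStopDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("graceful-stop-demo")
    val ssCtx = new StreamingContext(conf, Seconds(10))

    // Illustrative source and output operation; any DStream pipeline works.
    ssCtx.socketTextStream("localhost", 9999).print()

    // Register the hook before starting the streams.
    // (addShutdownHook is the helper defined above, assumed in scope.)
    addShutdownHook(ssCtx)

    ssCtx.start()
    // Blocks until the context is stopped, e.g. by the shutdown hook
    // after the SIGTERM that "yarn application -kill" triggers.
    ssCtx.awaitTermination()
  }
}

Note that recent Spark versions can do this for you: setting
spark.streaming.stopGracefullyOnShutdown=true (e.g. via --conf on
spark-submit) installs an equivalent graceful-shutdown hook.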