Hi Spark community,

I have a Spark Streaming application that reads from a Kinesis stream and
processes the data. It calls some services that can experience transient
failures; when such a failure happens, a retry mechanism kicks in.

For the shutdown use case, I have a separate thread that polls an SQS
queue for a shutdown message and calls StreamingContext.stop(). Normally,
processing of the current batch completes and the application shuts down.
But when the retry mechanism has kicked in, calling stop() does not shut
down the application, since the current batch has not finished processing.
I have tried with stopGracefully set to both true and false.

My use case dictates that the current batch processing be abandoned and the
application stop immediately. Am I missing something here?
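One workaround I am considering (a sketch only, plain Java rather than the Spark API, with hypothetical names): have the SQS-polling thread set a shared shutdown flag that the retry loop checks before each attempt, so the in-flight batch abandons its retries on its own before stop() is called, instead of blocking stop() until retries are exhausted.

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class RetryAbort {
    // Flag the SQS-polling shutdown thread would set (poller itself not shown).
    static final AtomicBoolean shutdownRequested = new AtomicBoolean(false);

    // Hypothetical retry helper: gives up as soon as shutdown is requested,
    // instead of exhausting all attempts on transient failures.
    static boolean callWithRetries(int maxAttempts, Runnable call) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (shutdownRequested.get()) {
                return false; // abandon the batch mid-retry
            }
            try {
                call.run();
                return true; // service call succeeded
            } catch (RuntimeException transientFailure) {
                // swallow and fall through to the next attempt
            }
        }
        return false; // all attempts failed
    }

    public static void main(String[] args) {
        // Simulate a downstream service that always fails transiently.
        Runnable flakyService = () -> { throw new RuntimeException("transient"); };

        // Without a shutdown request, all attempts are exhausted.
        boolean first = callWithRetries(3, flakyService);

        // After the shutdown thread flips the flag, retries stop at once,
        // even with a very large attempt budget.
        shutdownRequested.set(true);
        boolean second = callWithRetries(1000, flakyService);

        System.out.println(first + " " + second); // prints "false false"
    }
}
```

With this in place, stop() would only have to wait for the (now short) remainder of the batch rather than the full retry schedule; it does not force-kill anything, so it may not fully satisfy the "stop immediately" requirement.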

I have asked the same question here
<http://stackoverflow.com/questions/39267637/force-stop-a-spark-streaming-application-running-on-emr>
as well. Any help is appreciated.

Thanks,

Rajkiran