Job cancellation has been in both the 0.8.1 SNAPSHOT and the 0.9.0 SNAPSHOT for
a while now: PR 29 <https://github.com/apache/incubator-spark/pull/29>,
PR 74 <https://github.com/apache/incubator-spark/pull/74>.
Modification/improvement of job cancellation is part of the open pull
request PR 190 <https://github.com/apache/incubator-spark/pull/190>.
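For reference, usage looks roughly like the sketch below. This assumes the
job-group API (setJobGroup/cancelJobGroup on SparkContext) from the snapshots
above; names may still shift in PR 190, and `expensiveStep` is just a
placeholder for your own computation:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CancelSketch {
  // Hypothetical stand-in for a costly per-element computation.
  def expensiveStep(x: Int): Int = { Thread.sleep(1); x * 2 }

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("cancel-sketch").setMaster("local[2]"))

    // Run the action on a separate thread so the main thread is free to cancel.
    val worker = new Thread(new Runnable {
      def run(): Unit = try {
        // Jobs submitted from this thread are tagged with the group id.
        sc.setJobGroup("long-running", "speculative work we may abandon")
        sc.parallelize(1 to 1000000).map(expensiveStep).count()
      } catch {
        case e: Exception => // a cancelled job surfaces as a job failure here
      }
    })
    worker.start()

    // Later: cancel only that job group, leaving the SparkContext usable.
    sc.cancelJobGroup("long-running")
    worker.join()
    sc.stop()
  }
}
```

The point being that cancellation is scoped to the tagged jobs, so other work
on the same SparkContext keeps running.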


On Wed, Nov 20, 2013 at 3:39 AM, Mingyu Kim <[email protected]> wrote:

> Hi all,
>
> Cancellation seems to be supported at application level. In other words,
> you can call stop() on your instance of SparkContext in order to stop the
> computation associated with the SparkContext. Is there any way to cancel a
> job? (To be clear, job is "a parallel computation consisting of multiple
> tasks that gets spawned in response to a Spark action" as defined on the
> Spark website.) The current RDD API doesn't seem to provide this
> functionality, but I'm wondering if there is any way to do anything
> similar. I'd like to be able to cancel a long-running job that is found to
> be unnecessary without shutting down the SparkContext.
>
> If there is no way to simulate the cancellation currently, is there any
> plan to support this functionality? Or, is this just not part of the design
> or desired uses of SparkContext?
>
> Thanks!
>
> Mingyu
>
