Awesome! That's exactly what I needed. Is there any estimated timeline for
the 0.8.1 release?

Mingyu

From:  Mark Hamstra <[email protected]>
Reply-To:  "[email protected]"
<[email protected]>
Date:  Wednesday, November 20, 2013 at 4:06 AM
To:  user <[email protected]>
Subject:  Re: Job cancellation

Job cancellation has been in both the 0.8.1 SNAPSHOT and 0.9.0 SNAPSHOT for
a while now: PR29 <https://github.com/apache/incubator-spark/pull/29> , PR74
<https://github.com/apache/incubator-spark/pull/74> .
Modification/improvement of job cancellation is part of the open pull
request PR190 <https://github.com/apache/incubator-spark/pull/190> .
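For reference, a minimal sketch of what cancelling a job without stopping the SparkContext could look like, assuming the job-group style of API (`setJobGroup` / `cancelJobGroup`) associated with this line of work; the exact method names and availability should be checked against the actual 0.8.1/0.9.0 release, and the group id here is just an illustrative label:

```scala
import org.apache.spark.SparkContext

object CancelSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[2]", "cancel-demo")

    // Run a long action on a separate thread. Job groups are
    // thread-local, so the group is set inside the worker thread.
    val worker = new Thread(new Runnable {
      def run(): Unit = {
        try {
          sc.setJobGroup("long-running-query", "possibly unnecessary work")
          sc.parallelize(1 to 1000000).map { i => Thread.sleep(1); i }.count()
        } catch {
          // A cancelled job surfaces as a failed action on this thread.
          case _: Exception => println("job was cancelled")
        }
      }
    })
    worker.start()

    // Later, from another thread: cancel only that group of jobs,
    // leaving the SparkContext alive for further work.
    Thread.sleep(2000)
    sc.cancelJobGroup("long-running-query")

    worker.join()
    sc.stop()
  }
}
```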


On Wed, Nov 20, 2013 at 3:39 AM, Mingyu Kim <[email protected]> wrote:
> Hi all,
> 
> Cancellation seems to be supported at the application level. In other words, you
> can call stop() on your instance of SparkContext in order to stop the
> computation associated with the SparkContext. Is there any way to cancel a
> job? (To be clear, a job is "a parallel computation consisting of multiple tasks
> that gets spawned in response to a Spark action" as defined on the Spark
> website.) The current RDD API doesn't seem to provide this functionality, but
> I'm wondering if there is any way to do anything similar. I'd like to be able
> to cancel a long-running job that is found to be unnecessary without shutting
> down the SparkContext.
> 
> If there is no way to simulate the cancellation currently, is there any plan
> to support this functionality? Or, is this just not part of the design or
> desired uses of SparkContext?
> 
> Thanks!
> 
> Mingyu


