That's not really something that should be done by a Spark user, since there
is no public API to directly launch, cancel, and clean up after stages --
and doing so internally within Spark requires some knowledge of, and care
for, how stages are created, tracked, controlled, and coordinated between
the DAGScheduler and the ClusterScheduler.

With 0.8.1, there is a public way to do cancellation at the job level.
See SparkContext#setJobGroup and SparkContext#cancelJobGroup.
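
For illustration only (not from the original message), a rough sketch of how
those two calls might be used; here "sc" is an existing SparkContext, "data" is
an existing RDD, and the group id "my-group" is just a placeholder:

    import scala.concurrent.Future
    import scala.concurrent.ExecutionContext.Implicits.global

    // setJobGroup is per-thread, so call it on the thread that submits the job.
    val running = Future {
      sc.setJobGroup("my-group", "jobs we may want to cancel")
      data.count()  // any action submitted from this thread now belongs to "my-group"
    }

    // Later, from another thread, cancel every active job in that group.
    sc.cancelJobGroup("my-group")

Cancelling the group stops all active jobs tagged with that id, which in turn
cancels their stages, so this is the supported way to get the effect of killing
a stage.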



On Thu, Dec 19, 2013 at 12:30 AM, [email protected] <
[email protected]> wrote:

> How to kill a stage in Spark when I know the stage id?
>
>
>
> ------------------------------------------------------
> [email protected]
>
