Repository: spark
Updated Branches:
  refs/heads/master cf1d32e3e -> 32fad4233


[SPARK-3597][Mesos] Implement `killTask`.

The MesosSchedulerBackend did not previously implement `killTask`,
resulting in an exception.

Author: Brenden Matthews <bren...@diddyinc.com>

Closes #2453 from brndnmtthws/implement-killtask and squashes the following commits:

23ddcdc [Brenden Matthews] [SPARK-3597][Mesos] Implement `killTask`.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/32fad423
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/32fad423
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/32fad423

Branch: refs/heads/master
Commit: 32fad4233f353814496c84e15ba64326730b7ae7
Parents: cf1d32e
Author: Brenden Matthews <bren...@diddyinc.com>
Authored: Sun Oct 5 09:49:24 2014 -0700
Committer: Andrew Or <andrewo...@gmail.com>
Committed: Sun Oct 5 09:49:24 2014 -0700

----------------------------------------------------------------------
 .../spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala | 7 +++++++
 1 file changed, 7 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/32fad423/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
index b117863..e0f2fd6 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
@@ -372,6 +372,13 @@ private[spark] class MesosSchedulerBackend(
     recordSlaveLost(d, slaveId, ExecutorExited(status))
   }
 
+  override def killTask(taskId: Long, executorId: String, interruptThread: Boolean): Unit = {
+    driver.killTask(
+      TaskID.newBuilder()
+        .setValue(taskId.toString).build()
+    )
+  }
+
   // TODO: query Mesos for number of cores
   override def defaultParallelism() = sc.conf.getInt("spark.default.parallelism", 8)
 

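As an illustration of the delegation the patch adds (Spark hands the backend a `Long` task ID, while Mesos identifies tasks by a string-valued `TaskID`, so the override converts and forwards), here is a minimal self-contained sketch. The `KillableDriver`, `RecordingDriver`, and `BackendSketch` names are hypothetical stand-ins for this example, not Spark or Mesos APIs:

```scala
object KillTaskSketch {
  // Hypothetical stand-in for the Mesos SchedulerDriver's kill operation.
  trait KillableDriver {
    def killTask(taskId: String): Unit
  }

  // Test double that records which task IDs were asked to be killed.
  class RecordingDriver extends KillableDriver {
    var killed: List[String] = Nil
    def killTask(taskId: String): Unit = killed = taskId :: killed
  }

  // Mirrors the shape of the patch: convert Spark's Long task ID to the
  // string form the driver expects, then delegate the kill request.
  class BackendSketch(driver: KillableDriver) {
    def killTask(taskId: Long, executorId: String, interruptThread: Boolean): Unit =
      driver.killTask(taskId.toString)
  }

  def main(args: Array[String]): Unit = {
    val driver = new RecordingDriver
    new BackendSketch(driver).killTask(42L, "exec-1", interruptThread = false)
    println(driver.killed.head)  // prints "42"
  }
}
```

Before this patch, the backend inherited the default `killTask`, which is why attempting to kill a Mesos-scheduled task raised an exception instead of reaching the driver.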
