xupefei commented on code in PR #47815:
URL: https://github.com/apache/spark/pull/47815#discussion_r1730962280


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -2708,13 +2741,51 @@ class SparkContext(config: SparkConf) extends Logging {
   def cancelJobsWithTag(tag: String): Unit = {
     SparkContext.throwIfInvalidTag(tag)
     assertNotStopped()
-    dagScheduler.cancelJobsWithTag(tag, None)
+    dagScheduler.cancelJobsWithTag(
+      tag,
+      reason = None,
+      shouldCancelJob = None,
+      cancelledJobs = None)
+  }
+
+  /**
+   * Cancel all jobs that have been scheduled or are running.
+   *
+   * @param shouldCancelJob Callback function to be called for each active job. If the function
+   *    returns true, the job will be cancelled.
+   * @return A future that will be completed with the set of job IDs that were cancelled.
+   */
+  def cancelAllJobs(shouldCancelJob: ActiveJob => Boolean): Future[Set[Int]] = {

Review Comment:
   I think I can just make these private. They're added here to be used solely by the new APIs in SparkSession.
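
   For context, a self-contained sketch of the predicate-based cancellation pattern the diff introduces: a callback decides per-job whether to cancel, and the caller gets back the set of cancelled job IDs. `ActiveJob` here is a stand-in case class, not Spark's internal scheduler class, and `cancelMatching` is a hypothetical helper used only to illustrate the shape of the API.

   ```scala
   // Stand-in for Spark's internal ActiveJob; only the fields needed here.
   case class ActiveJob(jobId: Int, tag: String)

   object CancelSketch {
     // Mimics the shape of cancelAllJobs(shouldCancelJob): apply the predicate
     // to every scheduled job and return the IDs of the jobs that were cancelled.
     def cancelMatching(jobs: Seq[ActiveJob],
                        shouldCancelJob: ActiveJob => Boolean): Set[Int] =
       jobs.filter(shouldCancelJob).map(_.jobId).toSet

     def main(args: Array[String]): Unit = {
       val jobs = Seq(ActiveJob(1, "etl"), ActiveJob(2, "adhoc"), ActiveJob(3, "etl"))
       // Cancel every job carrying the "etl" tag.
       println(cancelMatching(jobs, _.tag == "etl")) // Set(1, 3)
     }
   }
   ```

   Returning the cancelled IDs (rather than `Unit`) is what lets the new SparkSession APIs report back which jobs the predicate actually matched.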



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

