juliuszsompolski commented on code in PR #41440:
URL: https://github.com/apache/spark/pull/41440#discussion_r1234929783


##########
core/src/main/scala/org/apache/spark/SparkContext.scala:
##########
@@ -829,6 +829,55 @@ class SparkContext(config: SparkConf) extends Logging {
     setLocalProperty(SparkContext.SPARK_JOB_INTERRUPT_ON_CANCEL, null)
   }
 
+  /**
+   * Set the behavior of job cancellation from jobs started in this thread.
+   *
+   * @param interruptOnCancel If true, then job cancellation will result in `Thread.interrupt()`
+   * being called on the job's executor threads. This is useful to help ensure that the tasks
+   * are actually stopped in a timely manner, but is off by default due to HDFS-1208, where HDFS

Review Comment:
   The HDFS-1208 bug is still open... but multiple places in Spark core have by now elected to just pass `true` here, so it likely doesn't make sense for the user to set it to `false`, as those places would generate interrupts anyway...
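   For context, a minimal sketch (plain JVM threads, not Spark APIs; `InterruptDemo` and its fields are hypothetical names) of why `Thread.interrupt()` helps cancellation take effect promptly: a thread blocked in sleep/IO receives an `InterruptedException` and can exit immediately instead of running to completion, which is the behavior `interruptOnCancel = true` triggers on executor threads.

   ```scala
   object InterruptDemo {
     // Returns true if the blocked worker observed the interrupt promptly.
     def demo(): Boolean = {
       @volatile var stoppedPromptly = false
       val worker = new Thread(() => {
         try Thread.sleep(60000) // simulates a long-blocking task
         catch {
           case _: InterruptedException =>
             stoppedPromptly = true // cancellation observed immediately
         }
       })
       worker.start()
       Thread.sleep(100)          // let the worker block in sleep()
       worker.interrupt()         // what interruptOnCancel = true does on executors
       worker.join()
       stoppedPromptly
     }

     def main(args: Array[String]): Unit =
       println(s"stopped promptly: ${demo()}")
   }
   ```

   The HDFS-1208 caveat is that an interrupt arriving while a thread is inside certain HDFS client code can mark the underlying node as dead, which is why the flag was historically off by default.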



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

