arunmahadevan commented on a change in pull request #23912: [SPARK-21029][SS] StreamingQuery should be stopped when the SparkSession is stopped
URL: https://github.com/apache/spark/pull/23912#discussion_r263506634
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/SparkContext.scala
 ##########
 @@ -83,6 +84,7 @@ class SparkContext(config: SparkConf) extends Logging {
 
   val startTime = System.currentTimeMillis()
 
+  private[spark] val stopping: AtomicBoolean = new AtomicBoolean(false)
 
 Review comment:
  This is to prevent multiple concurrent stop attempts and to ensure that the 
stop (and the shutdown hooks) runs only once, similar to the `stopped` 
variable. We cannot reuse the `stopped` variable because it is what reports the 
SparkContext's stopped status: if it were already set when the hooks run, 
stopping the query would fail with exceptions, since there are checks that the 
SparkContext is still alive before a query can stop.
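
The pattern described above can be sketched as follows. This is a hypothetical, simplified illustration (the class and method names are invented, not the actual `SparkContext` code): a separate `stopping` flag is flipped with `compareAndSet` so only the first caller runs the shutdown hooks, while `stopped` is set only afterwards, so liveness checks made from inside the hooks still pass.

```scala
import java.util.concurrent.atomic.AtomicBoolean

// Hypothetical sketch of the two-flag shutdown guard discussed above.
class Lifecycle {
  // Set first, to make stop() idempotent under concurrent calls.
  private val stopping = new AtomicBoolean(false)
  // Set last; this is the flag that liveness checks consult.
  private val stopped = new AtomicBoolean(false)

  def isStopped: Boolean = stopped.get()

  def stop(): Unit = {
    // Only the first caller wins the CAS and runs the hooks.
    if (stopping.compareAndSet(false, true)) {
      // Shutdown hooks run here. Because `stopped` is still false,
      // a hook that checks isStopped (e.g. to stop a streaming query)
      // does not fail with a "context already stopped" exception.
      stopped.set(true)
    }
  }
}
```

Using `stopped` alone would force a choice between losing idempotence and having the hooks observe an already-stopped context, which is exactly the failure the comment describes.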

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]