mridulm edited a comment on pull request #34098:
URL: https://github.com/apache/spark/pull/34098#issuecomment-928579930
> I've checked SparkEnv.stop and I didn't find obvious unchecked exceptions
> that would break the stop(). I think it may not be necessary to do the same
> try..catch to SparkEnv.stop
We have the following which can throw exceptions:
* `mapOutputTracker.stop()` can throw SparkException in case of timeout.
* `blockManager.stop()` can throw `InterruptedException`.
* `metricsSystem.stop()` could throw an exception, depending on the sink.
* `rpcEnv.shutdown()` could throw `InterruptedException` (and possibly others).
* `rpcEnv.awaitTermination` could throw `InterruptedException`.
Note that `sparkEnv.stop` itself is protected - so it would not cause sc
stop to be blocked.
Please check whether any of the above would prevent cleanup, and so cause
re-initialization of a new sc to fail or be problematic.
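To illustrate the try/catch pattern under discussion, here is a minimal Scala sketch (not Spark's actual code; the `tryStop` helper and component names are illustrative): each shutdown step runs in its own try/catch, so an exception from one component, such as a `SparkException` timeout from `mapOutputTracker.stop()`, cannot prevent the remaining components from stopping.

```scala
// Minimal sketch of per-component exception isolation during shutdown.
// Assumption: logging via println keeps the sketch self-contained; real
// code would use a proper logger.
object StopSafely {
  // Run one shutdown step, swallowing and reporting any exception so the
  // caller can continue stopping the remaining components.
  def tryStop(name: String)(body: => Unit): Unit = {
    try {
      body
    } catch {
      case e: Exception =>
        println(s"Exception while stopping $name: ${e.getMessage}")
    }
  }

  def main(args: Array[String]): Unit = {
    var stopped = 0
    // First step fails, but the later steps still run.
    tryStop("mapOutputTracker") { throw new RuntimeException("timeout") }
    tryStop("blockManager") { stopped += 1 }
    tryStop("metricsSystem") { stopped += 1 }
    println(s"components stopped cleanly: $stopped")
  }
}
```

Whether this wrapping is actually needed in `SparkEnv.stop` depends on whether an early exception there would leave state behind that breaks re-initialization of a new SparkContext, which is the question above.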
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]