Github user adamstatdna commented on the issue:
https://github.com/apache/spark/pull/16743
Yes, I believe it is still active. The solution stated by Marcelo, detecting
the exit code in local mode, would be a solution for my purposes of testing,
where you want to do end-to-end
Github user adamstatdna commented on the issue:
https://github.com/apache/spark/pull/16743
My use case is end-to-end automated testing in local mode using the
programmatic Launcher. I have tests where the Spark app is expected to be
FINISHED and others where it is expected to be FAILED.
Github user adamstatdna commented on the issue:
https://github.com/apache/spark/pull/16743
@zsxwing What's the best way to invoke the callback and update the
SparkAppHandle state when the Spark application reaches the FAILED or FINISHED
state? It currently lacks handling for FAILED.
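For context, the end-to-end test pattern being discussed can be sketched with the public launcher API: register a `SparkAppHandle.Listener`, block until a terminal state, then assert on it. The jar path and main class below are placeholders, and note that the whole point of this PR is that, before the fix, local mode may report FINISHED even when the driver exits with a non-zero code, so the FAILED assertion depends on that fix.

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;
import java.util.concurrent.CountDownLatch;

public class LocalModeEndToEndTest {
    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        // The listener fires on every state transition; we only care
        // about terminal states (FINISHED, FAILED, KILLED, ...).
        SparkAppHandle.Listener listener = new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle handle) {
                if (handle.getState().isFinal()) {
                    done.countDown();
                }
            }
            @Override
            public void infoChanged(SparkAppHandle handle) { /* unused */ }
        };

        // Placeholder app resource and main class, for illustration only.
        SparkAppHandle handle = new SparkLauncher()
            .setMaster("local[*]")
            .setAppResource("/path/to/my-app.jar")
            .setMainClass("com.example.MyApp")
            .startApplication(listener);

        done.await();

        // A "happy path" test asserts FINISHED; a failure-injection test
        // would assert FAILED here instead.
        if (handle.getState() != SparkAppHandle.State.FINISHED) {
            throw new AssertionError("Expected FINISHED, got " + handle.getState());
        }
    }
}
```

Running this requires a local Spark distribution on the classpath (or `SPARK_HOME` set), so it is a sketch of the test shape rather than a standalone runnable snippet.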
Github user adamstatdna commented on the issue:
https://github.com/apache/spark/pull/16743
@thomastechs It looks like it would be more appropriate to call the private
method stop(SparkAppHandle.State), since that ensures the backend is closed
(in addition to firing the callback) rather than just