pan3793 commented on code in PR #52770:
URL: https://github.com/apache/spark/pull/52770#discussion_r2476446287
##########
core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala:
##########
@@ -1166,7 +1166,7 @@ object SparkSubmit extends CommandLineUtils with Logging {
super.doSubmit(args)
} catch {
case e: SparkUserAppException =>
- exitFn(e.exitCode, Some(e))
+ exitFn(e.exitCode, Option(e.getCause))
Review Comment:
@vrozov According to the ScalaDoc, `SparkUserAppException` exists to
propagate the subprocess's exit code (the `cause` parameter was added later
but is never actually set anywhere in the current Spark codebase), so
printing its stacktrace for PySpark or SparkR apps is just noise.
I personally think the exception message alone is clear enough.
```scala
/**
 * Exception thrown when the main user code is run as a child process (e.g. pyspark) and we want
 * the parent SparkSubmit process to exit with the same exit code.
 */
private[spark] case class SparkUserAppException(exitCode: Int, cause: Throwable = null)
  extends SparkException(s"User application exited with $exitCode", cause)
```
BTW, this change simply restores the 4.0 behavior, or more precisely, the
behavior before SPARK-53620.
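To illustrate why `Option(e.getCause)` is quieter than `Some(e)` here: `Option.apply` maps `null` to `None`, so when the exception carries no cause (the common PySpark/SparkR case), nothing is handed to `exitFn` for stacktrace printing. The following is a minimal, self-contained sketch with a hypothetical stand-in for `exitFn`, not the actual `SparkSubmit` code:

```scala
// Minimal sketch (hypothetical names) of the exit-handling behavior
// discussed above; the real exitFn lives in SparkSubmit.
object ExitCodeSketch {
  // Simplified stand-in for Spark's SparkUserAppException.
  case class SparkUserAppException(exitCode: Int, cause: Throwable = null)
    extends Exception(s"User application exited with $exitCode", cause)

  // Hypothetical exitFn: only prints a stacktrace when an exception is
  // actually supplied.
  def exitFn(exitCode: Int, exception: Option[Throwable]): Unit = {
    exception.foreach(_.printStackTrace())
    println(s"exit code: $exitCode")
  }

  def main(args: Array[String]): Unit = {
    val e = SparkUserAppException(42) // cause defaults to null
    // Option(null) collapses to None, so no noisy stacktrace is printed:
    exitFn(e.exitCode, Option(e.getCause))
    // By contrast, Some(e) would always carry the exception and trigger
    // the stacktrace, even when there is no meaningful cause to show.
  }
}
```

The key difference is that `Some(e)` wraps the exception unconditionally, while `Option(e.getCause)` only surfaces a stacktrace when a real cause was attached.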
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]