holdenk commented on code in PR #33934:
URL: https://github.com/apache/spark/pull/33934#discussion_r1599043338
##########
common/utils/src/main/scala/org/apache/spark/SparkException.scala:
##########
@@ -145,8 +148,10 @@ private[spark] class SparkDriverExecutionException(cause: Throwable)
 * Exception thrown when the main user code is run as a child process (e.g. pyspark) and we want
* the parent SparkSubmit process to exit with the same exit code.
*/
-private[spark] case class SparkUserAppException(exitCode: Int)
- extends SparkException(s"User application exited with $exitCode")
+private[spark] case class SparkUserAppException(exitCode: Int, errorMsg: Option[String] = None)
+  extends SparkException(
+    log"User application exited with ${MDC(LogKeys.EXIT_CODE, exitCode)}".message +
+      errorMsg.map(error => s" and caused by\n$error").getOrElse(""))
Review Comment:
I'm not super familiar with MDC. Is this going to use a generic exit code
mapping to determine what went wrong from the exit code?
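For context on the change under review: a plain-Java sketch of what the new constructor does with the optional error message. `UserAppException` is a hypothetical stand-in; Spark's `log"..."` interpolator and `MDC(LogKeys.EXIT_CODE, ...)` attach a structured-logging key to the exit code, which a plain string concatenation approximates here.

```java
import java.util.Optional;

// Hypothetical simplified stand-in for the proposed SparkUserAppException:
// the message carries the exit code, and the optional child-process error
// text is appended on a new line when present.
class UserAppException extends RuntimeException {
    UserAppException(int exitCode, Optional<String> errorMsg) {
        super("User application exited with " + exitCode
                + errorMsg.map(e -> " and caused by\n" + e).orElse(""));
    }
}

public class UserAppExceptionDemo {
    public static void main(String[] args) {
        // Without an error message, only the exit code is reported.
        System.out.println(new UserAppException(1, Optional.empty()).getMessage());
        // With one, the captured child stderr follows on the next line.
        System.out.println(new UserAppException(2, Optional.of("Traceback ...")).getMessage());
    }
}
```

So the MDC part only tags the exit code for structured logging; the "what went wrong" text comes from the new `errorMsg` parameter, not from any exit-code mapping.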
##########
core/src/main/scala/org/apache/spark/deploy/PythonRunner.scala:
##########
@@ -92,15 +92,16 @@ object PythonRunner {
// see https://github.com/numpy/numpy/issues/10455
sparkConf.getOption("spark.driver.cores").foreach(env.put("OMP_NUM_THREADS", _))
}
- builder.redirectErrorStream(true) // Ugly but needed for stdout and stderr to synchronize
Review Comment:
Why is this not needed anymore?
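For reference, a minimal sketch of what the removed line did. `ProcessBuilder.redirectErrorStream(true)` merges the child's stderr into its stdout pipe, so one reader sees both streams roughly in write order. The `RedirectDemo` name and the `sh -c` child command are illustrative assumptions (a POSIX `sh` on the PATH), not anything from the PR.

```java
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

// Sketch: with redirectErrorStream(true), a single read of the child's
// stdout also yields whatever the child wrote to stderr.
public class RedirectDemo {
    public static String run() throws Exception {
        ProcessBuilder builder = new ProcessBuilder("sh", "-c", "echo out; echo err 1>&2");
        builder.redirectErrorStream(true); // without this, "err" appears only on getErrorStream()
        Process process = builder.start();
        byte[] bytes;
        try (InputStream in = process.getInputStream()) {
            bytes = in.readAllBytes();
        }
        process.waitFor();
        return new String(bytes, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        System.out.print(run());
    }
}
```

Dropping the call means stderr must now be consumed separately (presumably so the PR can capture it for the new `errorMsg`), otherwise the two streams no longer interleave on stdout.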
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]