robreeves commented on code in PR #53657:
URL: https://github.com/apache/spark/pull/53657#discussion_r2679233885
##########
core/src/main/scala/org/apache/spark/ui/jobs/AllJobsPage.scala:
##########
@@ -342,6 +343,16 @@ private[ui] class AllJobsPage(parent: JobsTab, store: AppStatusStore) extends We
}
}
</li>
+ {
+ exitCode match {
+ case Some(code) if code != 0 =>
+ <li>
+ <strong>Final Status:</strong>
+ {s"Failure (exit code: $code)"}
Review Comment:
I did that originally, but found there are some cases where the application
doesn't succeed yet still reports an exit code of 0, for example a Python app
that throws an exception in the driver while running in local mode. I was
worried this would be misleading and think we need to improve how exit codes
are set before showing a success status.
Here is an example script:
```python
from pyspark.sql import SparkSession
spark = SparkSession.builder \
.appName("CrashDriver") \
.getOrCreate()
# Create a DataFrame
data = [("Alice", 1), ("Bob", 2)]
df = spark.createDataFrame(data, ["name", "id"])
# Force a crash by raising an unhandled exception
raise RuntimeError("Intentional driver crash!")
spark.stop()
```
Here is the SparkListenerApplicationEnd event:
```json
{"Event":"SparkListenerApplicationEnd","Timestamp":1768108673350,"ExitCode":0}
```
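For reference, a minimal sketch of a listener that just logs whatever exit code the end event reports, which makes the behavior above (driver crash, but `ExitCode` 0) easy to spot while testing. Treating `exitCode` as an `Option[Int]` on `SparkListenerApplicationEnd` is an assumption based on the event JSON and the `exitCode match` pattern in the diff above, not necessarily the final API.
```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

// Sketch: log the exit code carried by the application-end event.
// Assumption: `exitCode` is exposed as an Option[Int], mirroring the
// `exitCode match { case Some(code) ... }` pattern in the diff above;
// adjust the accessor if the actual field differs.
class ExitCodeLoggingListener extends SparkListener {
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
    applicationEnd.exitCode match {
      case Some(code) if code != 0 =>
        println(s"Application ended with failure exit code $code")
      case Some(0) =>
        println("Application reported exit code 0 (may still have failed, as shown above)")
      case _ =>
        println("Application ended with no exit code reported")
    }
  }
}
```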