mridulm commented on code in PR #50573:
URL: https://github.com/apache/spark/pull/50573#discussion_r2049076181
##########
core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:
##########
@@ -112,11 +112,11 @@ private[spark] object SerializationDebugger extends Logging {
         val elem = s"externalizable object (class ${e.getClass.getName}, $e)"
         visitExternalizable(e, elem :: stack)
-      case s: Object with java.io.Serializable if Utils.isTesting =>
+      case s: Object with java.io.Serializable =>
         val str = try {
           s.toString
         } catch {
-          case _: SparkRuntimeException => "exception in toString"
+          case _: SparkRuntimeException if Utils.isTesting => "exception in toString"
Review Comment:
   The NPE I was referring to is the one thrown while executing `toString`: that is what we addressed in #50402.
   The current cause of the NPE may be (2), as you indicated, and so it is scoped to testing for now.
   But this can happen for other reasons too (including bugs in user UDFs, for example).
   The diff I am proposing: replace `case _: SparkRuntimeException if Utils.isTesting` with `case _: Exception`.
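
   A minimal standalone sketch of the broadened pattern (the object and class names here are illustrative, not the actual Spark code): any exception thrown by a user-supplied `toString`, such as an NPE from a buggy UDF, is swallowed and replaced with a placeholder, rather than only `SparkRuntimeException` under `Utils.isTesting`.

   ```scala
   // Illustrative sketch of the proposed defensive toString pattern.
   object ToStringGuard {
     def safeToString(obj: Any): String =
       try {
         obj.toString
       } catch {
         // Proposed broadening: catch any Exception (e.g. an NPE from a
         // user UDF), not just SparkRuntimeException when Utils.isTesting.
         case _: Exception => "exception in toString"
       }

     def main(args: Array[String]): Unit = {
       // A class whose toString throws, standing in for a misbehaving object.
       class Broken {
         override def toString: String = throw new NullPointerException("boom")
       }
       println(safeToString("ok"))       // prints: ok
       println(safeToString(new Broken)) // prints: exception in toString
     }
   }
   ```

   The trade-off of the wider `case _: Exception` is that genuine bugs surface only as the placeholder string, but in a debugger that is building a diagnostic message this is arguably preferable to the debugger itself crashing.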
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]