LuciferYang commented on code in PR #50489:
URL: https://github.com/apache/spark/pull/50489#discussion_r2032314865
##########
core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala:
##########
@@ -111,7 +112,12 @@ private[spark] object SerializationDebugger extends Logging {
visitExternalizable(e, elem :: stack)
case s: Object with java.io.Serializable =>
- val elem = s"object (class ${s.getClass.getName}, $s)"
+ val str = try {
+   String.valueOf(s)
+ } catch {
+   case _: SparkRuntimeException => "cannot print object"
+ }
+ val elem = s"object (class ${s.getClass.getName}, $str)"
Review Comment:
Can we just add a branch like `case s: Object with java.io.Serializable if Utils.isTesting =>`?
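
For illustration only, a rough sketch of the guarded arm being suggested (a minimal sketch, assuming the surrounding match dispatches to `visitSerializable` like the other arms, and that `org.apache.spark.util.Utils` and `org.apache.spark.SparkRuntimeException` are imported; names not visible in the diff above are assumptions):

    case s: Object with java.io.Serializable if Utils.isTesting =>
      // Test-only arm: toString may itself throw (e.g. a SparkRuntimeException that is
      // raised only under Utils.isTesting), so render the object defensively.
      val str = try String.valueOf(s) catch {
        case _: SparkRuntimeException => "cannot print object"
      }
      visitSerializable(s, s"object (class ${s.getClass.getName}, $str)" :: stack)
    case s: Object with java.io.Serializable =>
      // Production arm keeps the existing, unguarded rendering.
      val elem = s"object (class ${s.getClass.getName}, $s)"
      visitSerializable(s, elem :: stack)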
##########
core/src/test/scala/org/apache/spark/serializer/SerializationDebuggerSuite.scala:
##########
@@ -205,6 +214,12 @@ class SerializableClass1 extends Serializable
class SerializableClass2(val objectField: Object) extends Serializable
+class SerializableClassWithStringException(val objectField: Object) extends Serializable {
+ override def toString: String = {
Review Comment:
https://github.com/apache/spark/blob/554d67817e44498cca9d1a211d8bdc4a69dc9d0e/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala#L228-L229
So is this an issue that can only occur during testing?
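
For reference, a self-contained sketch of what a test class like the one above could look like, i.e. a Serializable object whose toString fails the same way the linked SQLConf check fails under Utils.isTesting. The error class, message parameters, and the SparkRuntimeException constructor defaults below are assumptions for illustration, not copied from the PR:

    import org.apache.spark.SparkRuntimeException

    class SerializableClassWithStringException(val objectField: Object) extends Serializable {
      override def toString: String = {
        // Simulate a toString that throws while the debugger renders the object,
        // e.g. a config accessor that only throws when Utils.isTesting is true.
        throw new SparkRuntimeException(
          errorClass = "INTERNAL_ERROR",
          messageParameters = Map("message" -> "cannot print object"))
      }
    }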