Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/6625#discussion_r31776659
  
    --- Diff: core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala ---
    @@ -145,17 +151,50 @@ private[spark] object SerializationDebugger extends Logging {
           // An object contains multiple slots in serialization.
           // Get the slots and visit fields in all of them.
           val (finalObj, desc) = findObjectAndDescriptor(o)
    +
    +      // If the object has been replaced using writeReplace(),
    +      // then call visit() on it again to test its type again.
    +      if (!finalObj.eq(o)) {
    +        return visit(finalObj, s"writeReplace data (class: ${finalObj.getClass.getName})" :: stack)
    +      }
    +
    +      // Every class is associated with one or more "slots", each slot is related to the parent
    +      // classes of this class. These slots are used by the ObjectOutputStream
    +      // serialization code to recursively serialize the fields of an object and
    +      // its parent classes. For example, if there are the following classes.
    +      //
    +      //     class ParentClass(parentField: Int)
    +      //     class ChildClass(childField: Int) extends ParentClass(1)
    +      //
    +      // Then serializing the an object Obj of type ChildClass requires for serializing the fields
    --- End diff --
    
    requires for -> requires first
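    For readers skimming this thread: the `writeReplace()` check the diff adds exists because `ObjectOutputStream` silently swaps an object for whatever its `writeReplace()` returns, so the debugger must re-visit the replacement. A minimal, hypothetical Java sketch (classes invented here, not from this PR) demonstrating that swap:

    ```java
    import java.io.*;

    // Hypothetical classes illustrating the writeReplace() contract.
    class Replacement implements Serializable {
        String payload = "replaced";
    }

    class Original implements Serializable {
        // ObjectOutputStream calls this reflectively before serializing
        // this object and writes the returned object instead.
        private Object writeReplace() throws ObjectStreamException {
            return new Replacement();
        }
    }

    public class WriteReplaceDemo {
        public static void main(String[] args) throws Exception {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(new Original());
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                // The stream holds the replacement, not the original.
                System.out.println(in.readObject().getClass().getSimpleName());
            }
        }
    }
    ```

    Running it prints `Replacement`, which is why the debugger re-runs `visit()` on `finalObj` when it differs from `o`.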
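    The "slots" the comment describes can also be observed directly via `java.io.ObjectStreamClass`: one descriptor per class in the hierarchy, written parent-first. A sketch under that reading, reusing the comment's `ParentClass`/`ChildClass` example (translated to Java for illustration):

    ```java
    import java.io.*;
    import java.util.*;

    class ParentClass implements Serializable {
        int parentField;
        ParentClass(int v) { parentField = v; }
    }

    class ChildClass extends ParentClass implements Serializable {
        int childField;
        ChildClass(int v) { super(1); childField = v; }
    }

    public class SlotDemo {
        public static void main(String[] args) {
            // Collect one ObjectStreamClass "slot" per class in the hierarchy,
            // addFirst so the topmost parent ends up at the front.
            Deque<ObjectStreamClass> slots = new ArrayDeque<>();
            for (Class<?> c = ChildClass.class; c != Object.class; c = c.getSuperclass()) {
                slots.addFirst(ObjectStreamClass.lookup(c));
            }
            // ObjectOutputStream writes field data slot by slot, parent first.
            for (ObjectStreamClass slot : slots) {
                for (ObjectStreamField f : slot.getFields()) {
                    System.out.println(slot.forClass().getSimpleName() + "." + f.getName());
                }
            }
        }
    }
    ```

    This prints `ParentClass.parentField` before `ChildClass.childField`, matching the parent-first field order the diff's comment is documenting.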

