dongjoon-hyun commented on pull request #29231:
URL: https://github.com/apache/spark/pull/29231#issuecomment-663868819


   I tried removing all three constraints: 1) making the no-arg constructor public, 2) making the primary constructor public, and 3) making the class public. However, it turns out that this does not resolve the issue.
   ```scala
   -private[spark] class HighlyCompressedMapStatus private (
   +class HighlyCompressedMapStatus(
        private[this] var loc: BlockManagerId,
        private[this] var numNonEmptyBlocks: Int,
        private[this] var emptyBlocks: RoaringBitmap,
   @@ -181,7 +181,7 @@ private[spark] class HighlyCompressedMapStatus private (
        || numNonEmptyBlocks == 0 || _mapTaskId > 0,
        "Average size can only be zero for map stages that produced no output")
   
   -  protected def this() = this(null, -1, null, -1, null, -1)  // For deserialization only
   +  def this() = this(null, -1, null, -1, null, -1)  // For deserialization only
   
      override def location: BlockManagerId = loc
   
   @@ -217,7 +217,7 @@ private[spark] class HighlyCompressedMapStatus private (
   
      override def readExternal(in: ObjectInput): Unit = Utils.tryOrIOException {
        loc = BlockManagerId(in)
   -    numNonEmptyBlocks = -1 // SPARK-32436 Scala 2.13 doesn't initialize this during deserialization
   +    // numNonEmptyBlocks = -1 // SPARK-32436 Scala 2.13 doesn't initialize this during deserialization
   ```
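
   For context, the constraints above exist because of how Java serialization handles `Externalizable` classes: deserialization invokes the public no-arg constructor and then `readExternal`, so any field that `readExternal` does not assign keeps whatever value the no-arg constructor left in it. A minimal standalone sketch of that pattern (hypothetical class names `SketchStatus`/`RoundTrip`, not Spark's actual code) could look like:
   ```scala
   import java.io._

   // Hypothetical sketch mirroring the HighlyCompressedMapStatus pattern:
   // an Externalizable class whose public no-arg constructor is required by
   // Java serialization, with a field restored explicitly in readExternal.
   class SketchStatus(private var loc: String, private var numNonEmptyBlocks: Int)
       extends Externalizable {

     // For deserialization only: Externalizable requires a public no-arg constructor.
     def this() = this(null, -1)

     def location: String = loc
     def blocks: Int = numNonEmptyBlocks

     override def writeExternal(out: ObjectOutput): Unit = {
       out.writeUTF(loc)
       // numNonEmptyBlocks is deliberately not written, as in the Spark class.
     }

     override def readExternal(in: ObjectInput): Unit = {
       loc = in.readUTF()
       numNonEmptyBlocks = -1 // restore the sentinel explicitly, not via the constructor
     }
   }

   object RoundTrip {
     def roundTrip(s: SketchStatus): SketchStatus = {
       val bytes = new ByteArrayOutputStream()
       val out = new ObjectOutputStream(bytes)
       out.writeObject(s)
       out.close()
       val in = new ObjectInputStream(new ByteArrayInputStream(bytes.toByteArray))
       in.readObject().asInstanceOf[SketchStatus]
     }

     def main(args: Array[String]): Unit = {
       val restored = roundTrip(new SketchStatus("host-1", 42))
       assert(restored.location == "host-1")
       assert(restored.blocks == -1) // set by readExternal, not by any constructor
       println("ok")
     }
   }
   ```
   The round trip only produces a well-defined `numNonEmptyBlocks` because `readExternal` assigns it explicitly; relying on constructor-time field initialization is exactly what breaks when the runtime's deserialization path differs, as described in SPARK-32436.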


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]