Github user zsxwing commented on the pull request:

    https://github.com/apache/spark/pull/3378#issuecomment-63769414
  
    It's weird. I just found that the sizes of both the old and the new `CompactBuffer(1)` are 56 bytes. I cannot explain why.
    
    Then I added a field to the old CompactBuffer like this:
    ```Scala
    class CompactBuffer[T] extends Seq[T] with Serializable {
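      // Extra reference field added only for this size test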
      val dummy: AnyRef = null
    
      // First two elements
      private var element0: T = _
      private var element1: T = _
    ```
    `println(estimateSize(CompactBuffer[Int](1)))` also outputs `56`.
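    
    For reference, a minimal sketch of how the measurement can be reproduced, assuming `estimateSize` above is essentially a wrapper around `org.apache.spark.util.SizeEstimator.estimate`. The file has to live under the `org.apache.spark` package because `CompactBuffer` is `private[spark]`, and the object name `CompactBufferSizeCheck` is made up for illustration:
    ```Scala
    package org.apache.spark.util.collection
    
    import org.apache.spark.util.SizeEstimator
    
    // Hypothetical harness: prints the estimated in-memory size (in bytes) of a
    // one-element CompactBuffer, the figure quoted as 56 above.
    object CompactBufferSizeCheck {
      def main(args: Array[String]): Unit = {
        val buf = CompactBuffer[Int](1)
        println(SizeEstimator.estimate(buf))
      }
    }
    ```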
    


