Github user mateiz commented on the pull request:

    https://github.com/apache/spark/pull/3524#issuecomment-65329936
  
    BTW, while it's true that it will ask for more than twice myMemoryThreshold, 
that's actually intentional. At the beginning, when a collection ramps up, its 
myMemoryThreshold is 0 until it has seen at least 1000 elements. At that point it 
needs to request something reasonable given its usage so far, which will be more 
than 2 * 0. After this point, hopefully the objects added are small, and when you 
get currentMemory > myMemoryThreshold there will be only a small gap between them. 
But the way the code is written, it makes sure to acquire twice the current memory 
*usage* (not the current limit) in both cases.
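    The policy described above can be sketched as follows. This is a hedged 
illustration of the request arithmetic, not the actual Spark code; the function 
name `amount_to_request` and the byte figures are made up for the example:

    ```python
    def amount_to_request(current_memory, my_memory_threshold):
        # Request the shortfall so that, once granted, the new threshold
        # equals 2 * current *usage* (not 2 * the current limit).
        return 2 * current_memory - my_memory_threshold

    # Ramp-up case: threshold is still 0 (fewer than 1000 elements seen),
    # so the first request is simply twice the current usage -- more than
    # twice the (zero) threshold, as the comment notes.
    assert amount_to_request(current_memory=5_000_000,
                             my_memory_threshold=0) == 10_000_000

    # Steady state: usage has just crossed the threshold, so the gap is
    # small, but the post-grant budget still lands at 2 * usage.
    granted = amount_to_request(current_memory=11_000_000,
                                my_memory_threshold=10_000_000)
    assert 10_000_000 + granted == 2 * 11_000_000
    ```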

