GitHub user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2401#issuecomment-56216271
  
    So, I'm a little disappointed that this doesn't at least follow the YARN 
model of "one setting that defines the overhead". Instead, it has two settings: 
one for a fraction of the heap, and one for a minimum floor that applies when 
the fraction-derived value falls below it. That seems too complicated.
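    
    To make that concrete, here is a rough sketch of what the two-setting 
model computes (the name `overheadTwoSettings` and the 0.07 / 384 MiB values 
are illustrative placeholders, not the PR's actual keys or defaults):
    
        // Sketch of the two-setting model: overhead is a fraction of the
        // executor heap, clamped to a configured minimum floor.
        def overheadTwoSettings(heapMb: Int, fraction: Double, minMb: Int): Int =
          math.max((heapMb * fraction).toInt, minMb)
    
        // e.g. an 8 GiB heap with a 7% fraction and a 384 MiB floor:
        overheadTwoSettings(8192, 0.07, 384)  // == 573 (MiB)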
    
    What's the argument against YARN's model of a single setting with an 
absolute overhead value? That doesn't require the user to do math, and makes 
things easier when the user needs a lot of overhead for some reason (e.g. 
heavy use of off-heap memory) that is not necessarily related to the heap 
size.
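    
    For illustration, this is the math the fractional model forces on such a 
user (`requiredFraction` is just a placeholder name for the back-of-the-envelope 
calculation, not anything in the code):
    
        // Under a single absolute setting the user states the overhead
        // outright; under a fractional setting they must back the fraction
        // out of the heap size instead.
        def requiredFraction(desiredOverheadMb: Int, heapMb: Int): Double =
          desiredOverheadMb.toDouble / heapMb
    
        // e.g. 4 GiB of off-heap overhead on a 2 GiB heap:
        requiredFraction(4096, 2048)  // == 2.0, the math an absolute setting avoids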

