Github user nishkamravi2 commented on the pull request:

    https://github.com/apache/spark/pull/1391#issuecomment-48836220
  
    Mridul, I think you are missing the point. We understand that in many 
cases this parameter will have to be specified by the developer, since there 
is no easy way to model it (that's why we are retaining it as a configurable 
parameter). The question, however, is what a good default value should be. 
    
    "I would like a good default estimate of overhead ... But that is not
    fraction of executor memory. "
    
    You are mistaken. Overhead may not be directly proportional to executor 
memory, but the two are almost certainly indirectly correlated, and overhead 
is probably correlated with other app-specific parameters as well. 
    
    "Until the magic explanatory variable is found, which one is less 
problematic for end users -- a flat constant that frequently has to be tuned, 
or an imperfect model that could get it right in more cases?"
    
    This is the right point of view.
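    
    For concreteness, here is a minimal Scala sketch (not the patch itself) 
contrasting the two candidate defaults. The 384 MB constant and the 7% 
fraction are illustrative values chosen for the example, not necessarily what 
this PR ships:
    
        object OverheadDefaults {
          // Option 1: a flat constant (MB) that frequently has to be tuned.
          def flatDefault: Int = 384
    
          // Option 2: an imperfect model -- a fraction of executor memory,
          // clamped to a floor so small executors still get enough headroom.
          def modeledDefault(executorMemoryMb: Int): Int =
            math.max(384, (0.07 * executorMemoryMb).toInt)
        }
    
        // E.g. a 20 GB (20480 MB) executor gets 1433 MB under the model,
        // vs. a fixed 384 MB under the flat constant.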

