Github user sryza commented on the pull request:

    https://github.com/apache/spark/pull/3525#issuecomment-65021562
  
    In Spark 1.2, the memory overhead defaults to 7% of the executor memory.
    Have you found that you need a larger fraction than this? In the change
    that added that fraction, there was some concern about having two
    different params (a constant overhead and a fraction) to control the
    same value.
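
    For reference, a minimal sketch of how that default would be computed,
    assuming the Spark 1.2 YARN behavior described above (the object and
    constant names below are illustrative, not Spark's actual identifiers):

        // Sketch of the Spark 1.2 default executor memory overhead on YARN.
        // The 7% factor is from the comment above; the 384 MB floor is the
        // documented minimum. This is an illustration, not Spark's code.
        object MemoryOverheadSketch {
          val OverheadFactor = 0.07 // fraction of executor memory
          val OverheadMinMB  = 384  // minimum overhead, in megabytes

          // Overhead used when spark.yarn.executor.memoryOverhead is unset.
          def defaultOverheadMB(executorMemoryMB: Int): Int =
            math.max((OverheadFactor * executorMemoryMB).toInt, OverheadMinMB)

          def main(args: Array[String]): Unit = {
            // An 8 GB (8192 MB) executor: max(0.07 * 8192, 384) = 573 MB
            println(defaultOverheadMB(8192)) // prints 573
          }
        }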

