GitHub user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/12571#issuecomment-215089176
  
    I understand that it can be overridden, but I'm not sure we should be in
    the business of setting GC flags for people. You set them to one value now,
    then someone comes along who wants a different default or wants to add
    another GC flag, you have to handle different Java vendors and versions,
    and it just becomes a maintenance headache.
    
    Also, as a user, if I wanted the defaults that came with my Java version I
    would now have to explicitly override these (and I wouldn't even know we
    set them, since they aren't documented).
    
    Now, I realize Spark is setting -XX:MaxPermSize, but I would argue we
    shouldn't be doing that either. As far as I know it is only around for
    legacy reasons, because Spark was originally doing it. @vanzin correct me
    if you know otherwise.
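
    To make the maintenance point concrete, here is a hypothetical sketch (not
    actual Spark launcher code; the class name and logic are mine) of the kind
    of version gating that even a single flag like -XX:MaxPermSize forces on
    us, since PermGen was removed in Java 8 and the flag is simply ignored with
    a warning there:

        import java.util.ArrayList;
        import java.util.List;

        // Hypothetical example of gating a default JVM flag by Java version.
        public class GcFlagDefaults {
          public static void main(String[] args) {
            List<String> jvmOpts = new ArrayList<>();
            // "1.7"/"1.8" on older JVMs, "9", "11", ... on newer ones.
            String spec = System.getProperty("java.specification.version");
            // PermGen (and -XX:MaxPermSize) only exists through Java 7, so
            // only add the flag there.
            if (spec.startsWith("1.") && spec.compareTo("1.8") < 0) {
              jvmOpts.add("-XX:MaxPermSize=256m");
            }
            System.out.println(jvmOpts);
          }
        }

    And every new default flag we add would need its own variant of that
    check, per vendor and version.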

