GitHub user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15005#discussion_r78353885
  
    --- Diff: docs/building-spark.md ---
    @@ -16,24 +16,27 @@ Building Spark using Maven requires Maven 3.3.9 or newer and Java 7+.
     
     ### Setting up Maven's Memory Usage
     
    -You'll need to configure Maven to use more memory than usual by setting `MAVEN_OPTS`. We recommend the following settings:
    +If you are compiling with Java 7, you'll need to configure Maven to use more memory than usual by setting `MAVEN_OPTS`:
    --- End diff ---
    
    I actually think the `ReservedCodeCacheSize` advice is defunct; I don't use it and don't have problems. But maybe it's not worth worrying about here. It has nothing to do with the warnings discussed in this section, though.
    
    Anyway, might I finally suggest that, because Java 8 is now more the default and soon to be required, this advice basically say:
    
    - Add `-Xmx` and `-XX:ReservedCodeCacheSize` to `MAVEN_OPTS`
    - Or else you might see this error
    - If you're on Java 7, by the way, also add the `MaxPermSize` param like so
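    
    To make that concrete, here's a rough sketch of what the rewritten advice could look like. The `2g`/`512m` values are illustrative, not something settled in this thread:
    
    ```sh
    # More heap and a larger code cache, recommended for all builds
    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
    
    # On Java 7 only, also raise the permanent generation size,
    # or the build may fail with PermGen space errors
    export MAVEN_OPTS="$MAVEN_OPTS -XX:MaxPermSize=512m"
    ```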

