Github user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/299#discussion_r11775108
  
    --- Diff: docs/configuration.md ---
    @@ -649,6 +652,34 @@ Apart from these, the following properties are also available, and may be useful
         Number of cores to allocate for each task.
       </td>
     </tr>
    +<tr>
    +  <td>spark.executor.extraJavaOptions</td>
    +  <td>(none)</td>
    +  <td>
    +    A string of extra JVM options to pass to executors. For instance, GC settings or other
    +    logging. Note that it is illegal to set Spark properties or heap size settings with this
    +    option. Spark properties should be set using a SparkConf object or the
    +    spark-defaults.conf file used with the spark-submit script. Heap size settings can be set
    --- End diff --
    
    should this be spark-defaults.properties since that is what the code looks for?
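
The distinction the quoted doc paragraph draws can be sketched as a spark-defaults.conf fragment (the values below are illustrative assumptions, not recommendations): Spark properties and heap size go through their own settings, while `spark.executor.extraJavaOptions` carries JVM-only flags such as GC logging.

```
# spark-defaults.conf (illustrative values)
# Heap size goes through spark.executor.memory, not -Xmx in extraJavaOptions
spark.executor.memory            2g
# JVM-only flags (e.g. GC logging) are what extraJavaOptions is for
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
```

spark-submit picks this file up from the conf/ directory by default; an alternate file can be supplied with --properties-file.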

