Github user andrewor14 commented on a diff in the pull request:
https://github.com/apache/spark/pull/299#discussion_r11720017
--- Diff: docs/configuration.md ---
@@ -643,6 +646,34 @@ Apart from these, the following properties are also available, and may be useful
Number of cores to allocate for each task.
</td>
</tr>
+<tr>
+ <td>spark.executor.extraJavaOptions</td>
+ <td>(none)</td>
+ <td>
+ A string of extra JVM options to pass to executors. For instance, GC settings or other
+ logging. Note that it is illegal to set Spark properties or heap size settings with this
+ option. Spark properties should be set using a SparkConf object or the
+ spark-defaults.conf file used with the spark-submit script. Heap size settings can be set
+ with spark.executor.memory.
+ </td>
+</tr>
+<tr>
+ <td>spark.executor.extraClassPath</td>
+ <td>(none)</td>
+ <td>
+ Extra classpath entries to append to the classpath of executors. This exists primarily
+ for backwards-compatiblity with older versions of Spark. Users typically should not need
--- End diff ---
compatibility
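
For context, here is a minimal sketch of how the two properties documented in the diff above might be set from application code, assuming the standard SparkConf builder API; the application name and JVM flag values are illustrative only:

    // Minimal sketch: GC/logging flags go in extraJavaOptions; Spark
    // properties and heap-size flags must not be passed there.
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .setAppName("example-app")  // illustrative name
      .set("spark.executor.extraJavaOptions",
           "-XX:+PrintGCDetails -XX:+PrintGCTimeStamps")
      .set("spark.executor.memory", "2g")  // heap size is set separately

The same keys could equivalently be placed in the spark-defaults.conf file used with the spark-submit script, as the diff notes.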