Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/2516#discussion_r17998434
--- Diff: core/src/main/resources/org/apache/spark/deploy/spark-submit-defaults.prop ---
@@ -0,0 +1,90 @@
+# The master URL for the cluster, e.g. spark://host:port, mesos://host:port, yarn, or local.
+# legacy env variable MASTER
+spark.master = local[*]
+
+# Should spark submit run in verbose mode: default is false
+spark.verbose = false
+
+# Comma-separated list of files to be placed in the working directory of each executor
+# spark.files =
+
+# Comma-separated list of .zip, .egg, or .py files to place on the PYTHONPATH for Python apps.
+# spark.submit.pyfiles =
+
+# Comma-separated list of local jars to include on the driver and executor classpaths.
+# spark.jars =
+
+
+# Path to a bundled jar including your application and all dependencies.
--- End diff --
In general I don't think this is a good place for documentation. This file
would be embedded in a jar file and no one would ever look at it.
Documentation should go in `docs/*`.

On top of that, this particular option doesn't make a lot of sense. It
wasn't an option before, and it was and should continue to be a required
command line argument of spark-submit. The "primary resource" is just the
main file of your application (e.g. the jar or python file). It doesn't make
a lot of sense to put this in a config file.
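
(For illustration only, not part of the review: a typical spark-submit
invocation passes the primary resource as a positional argument after the
flags; the class name, master URL, and jar path below are placeholders.)

    ./bin/spark-submit \
      --class org.example.MyApp \
      --master spark://host:7077 \
      --conf spark.executor.memory=2g \
      path/to/my-app.jar appArg1 appArg2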