Github user pwendell commented on a diff in the pull request:
https://github.com/apache/spark/pull/1845#discussion_r16397360
--- Diff: bin/spark-class ---
@@ -146,10 +149,28 @@ if $cygwin; then
fi
export CLASSPATH
-if [ "$SPARK_PRINT_LAUNCH_COMMAND" == "1" ]; then
+if [ -n "$SPARK_PRINT_LAUNCH_COMMAND" ]; then
echo -n "Spark Command: " 1>&2
echo "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@" 1>&2
echo -e "========================================\n" 1>&2
fi
-exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
+# In Spark submit client mode, the driver is launched in the same JVM as Spark submit itself.
+# Here we must parse the properties file for relevant "spark.driver.*" configs for launching
+# the driver JVM itself.
+
+if [ -n "$SPARK_SUBMIT_CLIENT_MODE" ]; then
+  # This is currently used only if the properties file actually consists of these special configs
--- End diff --
There is no doc anywhere here that explains why this `SparkClassLauncher`
is needed - notably, to help bootstrap setting certain Java options that we
don't want to parse in bash. It would be good to say this here or somewhere
in a comment in `SparkClassLauncher`.
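
For context, here is a minimal sketch of the sort of check and hand-off being
discussed. This is illustrative only, not the PR's actual code: the variable
names (`PROPERTIES_FILE`, `contains_driver_configs`) and the package of
`SparkClassLauncher` are assumptions.

```bash
# Hypothetical sketch: decide in bash whether to defer to the Java-side
# launcher, which parses "spark.driver.*" options instead of bash.
# PROPERTIES_FILE and contains_driver_configs are illustrative names.
PROPERTIES_FILE="${SPARK_SUBMIT_PROPERTIES_FILE:-"$SPARK_HOME/conf/spark-defaults.conf"}"

contains_driver_configs() {
  # Succeed if the properties file defines any spark.driver.* setting.
  [ -f "$PROPERTIES_FILE" ] && \
    grep -E "^spark\.driver\." "$PROPERTIES_FILE" >/dev/null 2>&1
}

if [ -n "$SPARK_SUBMIT_CLIENT_MODE" ] && contains_driver_configs; then
  # Hand off to a Java-side launcher so driver memory and extra JVM options
  # are parsed in Java rather than in bash. Package name is assumed here.
  exec "$RUNNER" -cp "$CLASSPATH" org.apache.spark.deploy.SparkClassLauncher "$@"
else
  exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
fi
```

A short comment along these lines at the hand-off point (or in
`SparkClassLauncher` itself) would make the motivation clear to future readers.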