Github user nishkamravi2 commented on a diff in the pull request:
https://github.com/apache/spark/pull/5504#discussion_r28536990
--- Diff: launcher/src/main/java/org/apache/spark/launcher/AbstractCommandBuilder.java ---
@@ -186,12 +186,24 @@ void addOptionString(List<String> cmd, String options) {
       addToClassPath(cp, String.format("%s/core/target/jars/*", sparkHome));
     }
-    final String assembly = AbstractCommandBuilder.class.getProtectionDomain().getCodeSource().
-      getLocation().getPath();
+    // We can't rely on the ENV_SPARK_ASSEMBLY variable to be set. Certain situations, such as
+    // when running unit tests, or user code that embeds Spark and creates a SparkContext
+    // with a local or local-cluster master, will cause this code to be called from an
+    // environment where that env variable is not guaranteed to exist.
+    //
+    // For the testing case, we rely on the test code to set and propagate the test classpath
+    // appropriately.
+    //
+    // For the user code case, we fall back to looking for the Spark assembly under SPARK_HOME.
+    // That duplicates some of the code in the shell scripts that look for the assembly, though.
+    String assembly = getenv(ENV_SPARK_ASSEMBLY);
+    if (assembly == null && isEmpty(getenv("SPARK_TESTING"))) {
+      assembly = findAssembly();
+    }
--- End diff --
I don't think this check 'ensures' that those tests work; rather, it 'requires' that those tests work if the assembly is not there (more like an assert). I don't feel strongly for or against it, but it does seem unnecessary.
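For readers following along, the fallback logic in the hunk above can be sketched as a small standalone method. This is a minimal illustration, not Spark's actual code: the class name, the method signatures, the literal env variable name, and the `lib/` jar layout used by `findAssembly` are all assumptions made for the sketch.

```java
import java.io.File;
import java.util.Map;

public class AssemblyResolver {

    // Mirrors the fallback in the diff: prefer the env variable, but when it
    // is unset (and we are not in a test run), scan SPARK_HOME for the
    // assembly jar instead. The env is passed in as a Map for testability.
    static String resolveAssembly(Map<String, String> env) {
        String assembly = env.get("SPARK_ASSEMBLY");
        String testing = env.get("SPARK_TESTING");
        boolean isTesting = testing != null && !testing.isEmpty();
        if (assembly == null && !isTesting) {
            assembly = findAssembly(env.get("SPARK_HOME"));
        }
        return assembly;
    }

    // Hypothetical stand-in for the findAssembly() call in the diff: look for
    // exactly one spark-assembly*.jar under SPARK_HOME/lib and fail loudly
    // otherwise, which is roughly what the shell scripts also do.
    static String findAssembly(String sparkHome) {
        File libDir = new File(sparkHome, "lib");
        File[] jars = libDir.listFiles(
            (dir, name) -> name.startsWith("spark-assembly") && name.endsWith(".jar"));
        if (jars == null || jars.length != 1) {
            throw new IllegalStateException(
                "Expected exactly one assembly jar in " + libDir);
        }
        return jars[0].getAbsolutePath();
    }
}
```

Note how the `SPARK_TESTING` check makes the fallback a no-op under tests, which is exactly the behavior the review comment is questioning: when testing, a missing assembly is simply left as `null` for later code to trip over.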