Github user nishkamravi2 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5504#discussion_r28488106
  
    --- Diff: launcher/src/main/java/org/apache/spark/launcher/AbstractCommandBuilder.java ---
    @@ -186,12 +186,24 @@ void addOptionString(List<String> cmd, String options) {
           addToClassPath(cp, String.format("%s/core/target/jars/*", sparkHome));
         }
     
    -    final String assembly = AbstractCommandBuilder.class.getProtectionDomain().getCodeSource().
    -   getLocation().getPath();
    +    // We can't rely on the ENV_SPARK_ASSEMBLY variable to be set. Certain situations, such as
    +    // when running unit tests, or user code that embeds Spark and creates a SparkContext
    +    // with a local or local-cluster master, will cause this code to be called from an
    +    // environment where that env variable is not guaranteed to exist.
    +    //
    +    // For the testing case, we rely on the test code to set and propagate the test classpath
    +    // appropriately.
    +    //
    +    // For the user code case, we fall back to looking for the Spark assembly under SPARK_HOME.
    +    // That duplicates some of the code in the shell scripts that look for the assembly, though.
    +    String assembly = getenv(ENV_SPARK_ASSEMBLY);
    +    if (assembly == null && isEmpty(getenv("SPARK_TESTING"))) {
    +      assembly = findAssembly();
    +    }
    --- End diff ---
    
    if (assembly == null) findAssembly() ?
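
    To illustrate the fallback behavior under discussion, here is a standalone sketch (not the actual `AbstractCommandBuilder` code; `resolveAssembly` and the stubbed `findAssembly` are hypothetical stand-ins) of how the patched logic resolves the assembly path: the env var wins if set, and the SPARK_HOME scan only runs when `SPARK_TESTING` is unset, since tests are expected to set the classpath themselves.

    ```java
    public class AssemblyResolution {

        // Stand-in for AbstractCommandBuilder.findAssembly(); the real method
        // scans under SPARK_HOME. A fixed fake path is returned here.
        static String findAssembly() {
            return "/opt/spark/lib/spark-assembly.jar";
        }

        static boolean isEmpty(String s) {
            return s == null || s.isEmpty();
        }

        // Mirrors the patched logic: prefer the ENV_SPARK_ASSEMBLY value,
        // and fall back to scanning only when not running under the test harness.
        static String resolveAssembly(String envAssembly, String sparkTesting) {
            String assembly = envAssembly;
            if (assembly == null && isEmpty(sparkTesting)) {
                assembly = findAssembly();
            }
            return assembly;
        }

        public static void main(String[] args) {
            // Env var set: used directly.
            System.out.println(resolveAssembly("/custom/assembly.jar", null));
            // Env var unset, not testing: fall back to findAssembly().
            System.out.println(resolveAssembly(null, null));
            // Env var unset, SPARK_TESTING set: stays null; the test code
            // is responsible for propagating the classpath.
            System.out.println(resolveAssembly(null, "1"));
        }
    }
    ```

    Dropping the `SPARK_TESTING` guard, as the reviewer suggests, would mean the last case also falls through to `findAssembly()`, which is the trade-off being debated.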


