GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/560

    [SPARK-1395] Fix "local:" URI support in Yarn mode (again).

    Mostly this updates some call sites to use methods that
    properly handle "local:" URIs. Aside from those, two changes:
    
    - the SPARK_JAR env variable needs to be propagated to
      remote processes when the spark jar is "local:",
      otherwise those processes won't be able to find it
      in their PWD (since the file won't be distributed)
    
    - I removed the hacky way that log4j configuration was
      being propagated to handle the "local:" case. It's
      much more cleanly (and generically) handled by using
      spark-submit arguments (--files to upload a config
      file, or setting spark.executor.extraJavaOptions to
      pass JVM arguments and use a local file).
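
    The key semantic difference is that a "local:" URI names a file
    already present on every node, so it must not go through the
    distributed cache. A minimal sketch of that resolution rule (not
    Spark's actual code; `resolve_uri` is a hypothetical helper for
    illustration):

    ```shell
    #!/bin/sh
    # Sketch: how a "local:" URI is treated differently from other URIs.
    # "local:" paths are used as-is on each node; anything else would need
    # to be uploaded/distributed before remote processes can see it.
    resolve_uri() {
      uri="$1"
      case "$uri" in
        local:*) echo "${uri#local:}" ;;   # already on every node: strip scheme, use path
        *)       echo "DISTRIBUTE:$uri" ;; # must be shipped via the distributed cache
      esac
    }

    resolve_uri "local:/opt/spark/lib/spark-assembly.jar"
    resolve_uri "hdfs:///user/me/app.jar"
    ```

    For the log4j case described above, the spark-submit options this
    change relies on look roughly like (paths here are examples):

        spark-submit --files /path/to/log4j.properties ...

    or, to point executors at a file already on each node:

        spark-submit --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/etc/spark/log4j.properties" ...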

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark yarn-local-2

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/560.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #560
    
----
commit ffa1b55ddf5164a752b88635ad5095c53874284f
Author: Marcelo Vanzin <[email protected]>
Date:   2014-04-25T18:07:28Z

    [SPARK-1395] Fix "local:" URI support in Yarn mode (again).
    
    Mostly this updates some call sites to use methods that
    properly handle "local:" URIs. Aside from those, two changes:
    
    - the SPARK_JAR env variable needs to be propagated to
      remote processes when the spark jar is "local:",
      otherwise those processes won't be able to find it
      in their PWD (since the file won't be distributed)
    
    - I removed the hacky way that log4j configuration was
      being propagated to handle the "local:" case. It's
      much more cleanly (and generically) handled by using
      spark-submit arguments (--files to upload a config
      file, or setting spark.executor.extraJavaOptions to
      pass JVM arguments and use a local file).

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---