GitHub user skyluc opened a pull request:

    https://github.com/apache/spark/pull/10329

    [SPARK-12345] [CORE] Do not send SPARK_HOME through Spark submit REST interface

    The SPARK_HOME value sent by the submitter is usually an invalid location on
    the remote machine executing the job. It is picked up by the Mesos support in
    cluster mode, and most of the time causes the job to fail.
    
    Fixes SPARK-12345
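
    For context, below is a minimal Scala sketch of the kind of filtering this
    implies when the client builds the set of environment variables forwarded
    with the REST submission request. The object and method names are made up
    for illustration; this is not the actual patch.

        // Hypothetical sketch (not the actual patch): forward SPARK_* variables
        // to the cluster, but skip SPARK_HOME so the driver on the remote
        // machine resolves its own installation path instead of the submitter's.
        object FilterSparkHome {
          def forwardedEnv(env: Map[String, String] = sys.env): Map[String, String] =
            env.filter { case (k, _) => k.startsWith("SPARK_") && k != "SPARK_HOME" }
        }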

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/skyluc/spark issue/SPARK_HOME

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10329.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #10329
    
----
commit f75815b9c4669b2871029cb35f8c95960d6fd54f
Author: Luc Bourlier <[email protected]>
Date:   2015-12-16T11:17:30Z

    Do not send SPARK_HOME through Spark submit REST interface
    
    The SPARK_HOME value sent by the submitter is usually an invalid location on
    the remote machine executing the job. It is picked up by the Mesos support in
    cluster mode, and most of the time causes the job to fail.
    
    Fixes SPARK-12345

----


