GitHub user ScrapCodes opened a pull request:

    https://github.com/apache/spark/pull/17357

    [SPARK-20025][CORE] Fix Spark's driver failover mechanism.

    ## What changes were proposed in this pull request?
    
    On a bare-metal cluster with no DNS setup, Spark may be configured with
    SPARK_LOCAL* environment variables for the IP and host properties.
    During a driver failover in cluster deploy mode, the SPARK_LOCAL* values
    inherited from the original host should be ignored when the driver is
    automatically redeployed, and should instead be picked up from the
    target system's local environment.
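
    The gist of the change, as a minimal sketch rather than the actual
    patch (buildDriverEnv and the sample values below are hypothetical):
    when the environment for a relaunched driver is rebuilt on the target
    worker, host-specific SPARK_LOCAL* entries carried over from the
    submitting machine are dropped so the target host's own settings take
    effect.

        // Sketch only: filter host-specific SPARK_LOCAL* variables out of
        // the environment handed to a relaunched driver, so the new host's
        // own values apply. buildDriverEnv is a hypothetical helper, not a
        // real Spark API.
        object DriverEnvFilter {
          // SPARK_LOCAL_IP / SPARK_LOCAL_HOSTNAME only make sense on the
          // host that set them; they must not follow the driver elsewhere.
          private def isHostLocal(key: String): Boolean =
            key.startsWith("SPARK_LOCAL")

          def buildDriverEnv(submitted: Map[String, String]): Map[String, String] =
            submitted.filterNot { case (k, _) => isHostLocal(k) }

          def main(args: Array[String]): Unit = {
            val submitted = Map(
              "SPARK_LOCAL_IP"       -> "10.0.0.5",       // valid only on the old host
              "SPARK_LOCAL_HOSTNAME" -> "node-a",
              "SPARK_CONF_DIR"       -> "/opt/spark/conf" // safe to carry over
            )
            println(buildDriverEnv(submitted))
            // Map(SPARK_CONF_DIR -> /opt/spark/conf)
          }
        }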
    
    
    ## How was this patch tested?
    
    Tested with a distributed deployment against a cluster of 3 nodes and
    6 workers. Verified by killing the JVMs running the driver and checking
    that the restarted JVMs had the right configuration.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ScrapCodes/spark driver-failover-fix

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/17357.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #17357
    
----
commit dc9cd31ce20dbf5fad28a031b0989084ca671f32
Author: Prashant Sharma <prash...@in.ibm.com>
Date:   2017-03-20T05:38:37Z

    [CORE] Fix spark's driver failover mechanism.

----

