GitHub user tanyatik opened a pull request:

    https://github.com/apache/spark/pull/2062

    [SPARK-3150] Fix NullPointerException in Spark recovery: Add
initialization of default values in DriverInfo.init()

    The issue happens when Spark runs in standalone mode on a cluster.
    When the master and the driver fail simultaneously on one node of the
cluster, the master tries to recover its state and restart the Spark driver.
    While restarting the driver, the master fails with a NullPointerException
(stack trace is below).
    After failing, it restarts, tries to recover its state, and restarts the
Spark driver again, over and over in an infinite cycle.
    Specifically, Spark reads the DriverInfo state back from ZooKeeper, but
after reading, DriverInfo.worker turns out to be null.
    
    
https://issues.apache.org/jira/browse/SPARK-3150
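
For illustration, here is a minimal sketch of the kind of fix the title
describes: re-running default initialization through an init() helper so that
@transient fields are not left null after the master deserializes DriverInfo
from ZooKeeper. The simplified field types, the class body, and the round-trip
demo below are assumptions made for this example, not the actual Spark patch.

    import java.io.{ByteArrayInputStream, ByteArrayOutputStream,
      ObjectInputStream, ObjectOutputStream}

    // Simplified, illustrative stand-in for
    // org.apache.spark.deploy.master.DriverInfo; the field types here are
    // placeholders, not the real Spark classes.
    class DriverInfo(val id: String) extends Serializable {
      // @transient fields are not written out, and field initializers are not
      // re-run on deserialization, so after the master reads the object back
      // from ZooKeeper these would be null without init().
      @transient var worker: Option[String] = None
      @transient var exception: Option[Exception] = None

      init()

      // Safe defaults, shared by construction and deserialization.
      private def init(): Unit = {
        worker = None
        exception = None
      }

      // Hook invoked by Java serialization on readback: restore defaults so
      // the recovering master never sees null and throws a
      // NullPointerException.
      private def readObject(in: ObjectInputStream): Unit = {
        in.defaultReadObject()
        init()
      }
    }

    object RecoveryRoundTrip {
      def main(args: Array[String]): Unit = {
        // Simulate the master persisting and then recovering a driver record.
        val buf = new ByteArrayOutputStream()
        val out = new ObjectOutputStream(buf)
        out.writeObject(new DriverInfo("driver-1"))
        out.close()

        val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
        val restored = in.readObject().asInstanceOf[DriverInfo]

        // Prints "None"; without the readObject/init() hook it would print
        // "null", which is the state the recovering master tripped over.
        println(restored.worker)
      }
    }

The round-trip object above only demonstrates the serialization behaviour
behind the NPE; the actual change is in the Spark master's DriverInfo.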

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/tanyatik/spark spark-3150

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2062.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2062
    
----
commit 9936043f26937887a23aa01d340c34ac71a51673
Author: Tatiana Borisova <[email protected]>
Date:   2014-08-20T18:17:45Z

    Add initializing default values in DriverInfo.init()

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---

