GitHub user sitalkedia opened a pull request:

    https://github.com/apache/spark/pull/12436

    [SPARK-14649][CORE] DagScheduler should not run duplicate tasks on fetch failure

    ## What changes were proposed in this pull request?
    
    Currently, in case of a fetch failure, the DAGScheduler re-runs all the 
pending tasks of the failed stage, even if some of those tasks are still 
running. This creates a situation where many duplicate tasks run on the 
cluster.
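
    To make the intent concrete, here is a minimal, hypothetical sketch (the 
names are illustrative, not the actual DAGScheduler API): on a fetch failure, 
only partitions with neither a finished nor a currently running task attempt 
should be resubmitted.

        // Hypothetical, simplified model of the resubmission decision.
        case class Stage(numPartitions: Int,
                         finished: Set[Int],  // partitions already computed
                         running: Set[Int])   // partitions with a live attempt

        // Resubmit only partitions that are neither finished nor already
        // being computed, instead of every non-finished partition.
        def partitionsToResubmit(stage: Stage): Seq[Int] =
          (0 until stage.numPartitions)
            .filterNot(stage.finished)
            .filterNot(stage.running)

        // Partition 0 is finished and partition 1 is still running, so only
        // partitions 2 and 3 get new tasks; the attempt already running for
        // partition 1 is not duplicated.
        partitionsToResubmit(Stage(4, finished = Set(0), running = Set(1)))
        // => Vector(2, 3)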
    
    ## How was this patch tested?
    
    Added a new test case and verified that it fails without the change.
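
    Against the hypothetical model above, the shape of such a regression 
check would be the following (the actual test presumably drives the real 
DAGScheduler through a simulated fetch failure instead):

        // Partition 1 is still running, so it must not be resubmitted.
        assert(partitionsToResubmit(Stage(4, finished = Set(0), running = Set(1)))
          == Vector(2, 3))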

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sitalkedia/spark avoid_duplicate_tasks

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/12436.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #12436
    
----
commit 3c77e69345b3ef82b6d4a07e202a836ec75c153e
Author: Sital Kedia <[email protected]>
Date:   2016-04-15T23:44:23Z

    [SPARK-14649][CORE] DagScheduler should not run duplicate tasks on fetch failure

----

