GitHub user sitalkedia opened a pull request:

    https://github.com/apache/spark/pull/19580

    [SPARK-22312][CORE] Fix bug in Executor allocation manager in running tasks calculation

    ## What changes were proposed in this pull request?
    
    We often see Spark jobs get stuck because, with dynamic allocation turned on, the Executor Allocation Manager does not request any executors even though there are pending tasks. Looking at the logic in the Executor Allocation Manager that calculates the number of running tasks, the calculation can go wrong and the running-task count can become negative.
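    The following is an illustrative Java sketch, not Spark's actual ExecutorAllocationManager code (Spark core is written in Scala, and all class and method names here are hypothetical). It shows how a duplicated task-end event can drive a running-task counter negative, which in turn cancels out the pending-task count and leaves the computed executor target at zero:

    ```java
    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch of a running-task counter like the one the PR fixes.
    class AllocationSketch {
        private int runningTasks = 0;
        private final Set<Long> liveTasks = new HashSet<>();

        void onTaskStart(long taskId) {
            liveTasks.add(taskId);
            runningTasks += 1;
        }

        // Buggy variant: decrements unconditionally, so a duplicated
        // task-end event (e.g. delivered twice) drives the count negative.
        void onTaskEndBuggy(long taskId) {
            runningTasks -= 1;
        }

        // Guarded variant: only decrement for a task we actually counted.
        void onTaskEndFixed(long taskId) {
            if (liveTasks.remove(taskId)) {
                runningTasks -= 1;
            }
        }

        // Executors needed = ceil((running + pending) / tasksPerExecutor).
        // A negative runningTasks cancels out pendingTasks here, so no
        // executors are requested even though work is queued.
        int targetExecutors(int pendingTasks, int tasksPerExecutor) {
            int needed = runningTasks + pendingTasks;
            return (needed + tasksPerExecutor - 1) / tasksPerExecutor;
        }

        int running() { return runningTasks; }
    }
    ```

    With the buggy variant, one started task and two end events leave the count at -1, so one pending task yields a target of zero executors; the guarded variant keeps the count at 0 and requests an executor for the pending task.
    
    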
    
    ## How was this patch tested?
    
    Added a unit test.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sitalkedia/spark skedia/fix_stuck_job

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19580.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19580
    
----
commit f8fcc3560e087440c7618b33cc892f3feafd4a3a
Author: Sital Kedia <[email protected]>
Date:   2017-10-19T05:24:38Z

    [SPARK-22312][CORE] Fix bug in Executor allocation manager in running tasks calculation

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
