Github user kayousterhout commented on the pull request:

    https://github.com/apache/spark/pull/9154#issuecomment-149106963
  
    @vanzin my understanding is that addPendingTask looks through each 
preferred location for the task and adds the task to (1) the list of pending 
tasks for the executor corresponding to that location, (2) the list for the 
host corresponding to that location, and (3) the list for the rack 
corresponding to that location.  For tasks that are not yet running, it seems 
like calling this in executorLost should have no effect: the only difference 
from the previous time addPendingTask was called (before the executor was 
lost) is that, back then, the task would also have been added to one 
additional list, the one for the now-lost executor.
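    To spell out the bookkeeping I mean, here is a minimal sketch (not the 
actual TaskSetManager code), assuming a simplified Location type in place of 
Spark's TaskLocation hierarchy; the real addPendingTask also updates other 
structures such as the no-preference and all-pending lists:

```scala
import scala.collection.mutable.{ArrayBuffer, HashMap}

// Hypothetical simplified preferred location: optional executor, host, optional rack.
case class Location(executorId: Option[String], host: String, rack: Option[String])

class PendingTaskLists {
  val pendingTasksForExecutor = new HashMap[String, ArrayBuffer[Int]]
  val pendingTasksForHost = new HashMap[String, ArrayBuffer[Int]]
  val pendingTasksForRack = new HashMap[String, ArrayBuffer[Int]]

  def addPendingTask(index: Int, preferredLocations: Seq[Location]): Unit = {
    for (loc <- preferredLocations) {
      // (1) the list for the executor corresponding to this location, if it names one
      loc.executorId.foreach { execId =>
        pendingTasksForExecutor.getOrElseUpdate(execId, new ArrayBuffer) += index
      }
      // (2) the list for the host corresponding to this location
      pendingTasksForHost.getOrElseUpdate(loc.host, new ArrayBuffer) += index
      // (3) the list for the rack corresponding to this location, if known
      loc.rack.foreach { rack =>
        pendingTasksForRack.getOrElseUpdate(rack, new ArrayBuffer) += index
      }
    }
  }
}
```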
    
    What do you mean about tasks being in the wrong list?  I see that there 
will be an entry (with a list of pending tasks) in pendingTasksForExecutor 
corresponding to an executor that is dead, but calling addPendingTask never 
removes anything from any of these mappings, so that call doesn't fix that 
problem.
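    To make that second point concrete with the sketch above: calling 
addPendingTask again after the executor is lost only appends to the maps, so 
the entry keyed by the dead executor stays (and the task index ends up in its 
host list twice):

```scala
import scala.collection.mutable.ArrayBuffer

val lists = new PendingTaskLists
val loc = Location(Some("exec-1"), "host-1", None)
lists.addPendingTask(0, Seq(loc))
// ... "exec-1" is lost, and addPendingTask is called again for task 0 ...
lists.addPendingTask(0, Seq(loc))
// The stale key for the dead executor is still present; nothing was removed.
assert(lists.pendingTasksForExecutor("exec-1") == ArrayBuffer(0, 0))
```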

