Github user tnachen commented on the pull request:

    https://github.com/apache/spark/pull/3393#issuecomment-64098126
  
    For 3, you are right: if the offer is not used, it is not acked.
    My assumption was that all offers we accepted based on the conditions,
including the ones expanding existing executors, are used.
    If that is not correct, then we need to do what you suggested and keep a
local map to see which offers are used by the task scheduler impl.
    I need to dig a bit deeper into the Spark scheduler.
    @jongyoul I think if @pwendell's condition can be true, your fix needs to
account for that as well; sorry for the confusion.
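
    The "local map" suggestion above could be sketched roughly as follows.
This is a hypothetical illustration, not code from the PR: `OfferTracker`,
its method names, and the string offer IDs are all invented here. The idea
is simply to record every offer we accept, mark the ones the task scheduler
actually consumes, and then drain the rest so they can be explicitly
declined back to Mesos.

    ```java
    import java.util.*;

    public class OfferTracker {
        // Map from offer ID to whether the task scheduler used it.
        // All names here are hypothetical, for illustration only.
        private final Map<String, Boolean> offerUsed = new HashMap<>();

        // Called when we accept an offer based on our conditions.
        public void recordOffer(String offerId) {
            offerUsed.put(offerId, false);
        }

        // Called when the task scheduler actually launches a task on it.
        public void markUsed(String offerId) {
            offerUsed.computeIfPresent(offerId, (id, used) -> true);
        }

        // Returns the offers we accepted but never used (candidates to
        // decline), and removes them from the map.
        public List<String> drainUnused() {
            List<String> unused = new ArrayList<>();
            for (Map.Entry<String, Boolean> e : offerUsed.entrySet()) {
                if (!e.getValue()) {
                    unused.add(e.getKey());
                }
            }
            offerUsed.keySet().removeAll(unused);
            return unused;
        }
    }
    ```

    With a structure like this, the driver could call `drainUnused()` after
each scheduling round and decline exactly those offers, rather than assuming
every accepted offer was consumed.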

