GitHub user srowen commented on the pull request:

    https://github.com/apache/spark/pull/5536#issuecomment-94736176
  
    Trying to synthesize the several comments here: obviously we want to avoid
the exception reported in the JIRA. Sandy, you say that it's OK for the pending
executor count to be -10, but does that mean that whatever makes the request
with value "-10" now needs logic to never request anything < 0 instead?

    That is, are you suggesting that the capping should not actually change the
value from -10? Does the value then stay at -10 indefinitely? That's what the
new charts on the JIRA suggest. I suppose there is no allocation that would
"fulfill" a request for -10 pending executors and bring the count back toward
zero.

    Or is the issue that the shutdown of the executors should update the pending
count too? In that reading, -10 pending executors means "waiting to see 10
executors stop".
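
    To make the two readings concrete, here is a minimal sketch (hypothetical
code, not Spark's actual `ExecutorAllocationManager`; the class and method
names are invented for illustration) of the two strategies being discussed:
clamping the pending count at zero when issuing a request, versus treating a
negative count as "executors waiting to stop" and decrementing it as they exit.

    ```java
    // Hypothetical model of the pending-executor bookkeeping under discussion.
    class PendingExecutorTracker {
        private int target;   // number of executors we want
        private int running;  // number of executors currently alive

        PendingExecutorTracker(int target, int running) {
            this.target = target;
            this.running = running;
        }

        // "Pending" goes negative when the target drops below the running
        // count, e.g. target=5, running=15 gives pending=-10.
        int rawPending() {
            return target - running;
        }

        // Reading 1: cap at request time, so we never ask the cluster
        // manager for a negative number of executors; the raw value may
        // then sit at -10 indefinitely, as the JIRA charts suggest.
        int cappedRequest() {
            return Math.max(0, rawPending());
        }

        // Reading 2: executor shutdown updates the count, so -10 means
        // "waiting to see 10 executors stop" and each exit moves it
        // toward zero.
        void onExecutorExited() {
            running--;
        }
    }
    ```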


