Github user rdblue commented on the pull request:

    https://github.com/apache/spark/pull/11528#issuecomment-192544174
  
    @andrewor14, thanks for taking a look at this so quickly.
    
    From what we've seen, what you suggest isn't a viable work-around in 
practice because the two properties are set at different times and aren't 
obviously related. We default jobs to dynamic allocation, so that's an admin 
configuration most users don't see (unless they want to opt in to static 
allocation). Users set up their applications, and it isn't clear to them at 
that time that requesting some number of executors results in static 
allocation. They also don't necessarily see the notification in their job logs.
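    
    For illustration, here's a minimal Scala sketch of the collision described 
above. The property names are the standard dynamic-allocation and executor 
settings; the app name and executor count are made up:
    
        import org.apache.spark.{SparkConf, SparkContext}
    
        // Admin-side defaults (e.g. in spark-defaults.conf), which most users
        // never look at:
        //   spark.dynamicAllocation.enabled  true
        //   spark.shuffle.service.enabled    true
    
        // User-side application: requesting a fixed executor count looks
        // harmless, but --num-executors maps to spark.executor.instances, and
        // setting that property switches the job back to static allocation,
        // with only a log message to say so.
        val conf = new SparkConf()
          .setAppName("example-app")               // hypothetical app name
          .set("spark.executor.instances", "20")   // same effect as --num-executors 20
    
        // Master URL is supplied by spark-submit in this scenario.
        val sc = new SparkContext(conf)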
    
    I think a reasonable interpretation is that --num-executors would control 
the minimum or the initial number of executors under dynamic allocation. I'm 
fine with the default going the other way, but we lose a lot of resources to 
accidental static allocation, so I think the additional option is worth the 
trade-off. I'm happy to discuss other approaches; the main problem with your 
suggestion is just that users aren't aware that the two properties are related.
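    
    To make that reading concrete, here's a rough Scala sketch of what it 
would mean in terms of the existing dynamic-allocation properties. This is 
only an illustration of the interpretation, not the behavior of this patch, 
and the numbers are arbitrary:
    
        import org.apache.spark.{SparkConf, SparkContext}
    
        // Treat the requested executor count as the starting point for dynamic
        // allocation rather than as a switch to static allocation.
        val conf = new SparkConf()
          .setAppName("example-app")
          .set("spark.dynamicAllocation.enabled", "true")
          .set("spark.dynamicAllocation.initialExecutors", "20") // the requested count
          .set("spark.dynamicAllocation.minExecutors", "0")      // can still scale down
          .set("spark.dynamicAllocation.maxExecutors", "200")    // and up, within a cap
    
        // Master URL is supplied by spark-submit in this scenario.
        val sc = new SparkContext(conf)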

