Github user mwws commented on the pull request:

    https://github.com/apache/spark/pull/8760#issuecomment-161165662
  
    @kayousterhout I believe in most cases users can just use the default 
configuration or set `spark.scheduler.blacklist.strategy` to `executorAndNode`. 
I also like the idea of minimizing user configuration, but at the same time it 
would be better to provide extensibility and flexibility in case the "smart" 
system cannot satisfy all user requirements.
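    For illustration, a user choosing the node-level strategy could set the 
config key proposed in this PR in `spark-defaults.conf` (a sketch; the exact 
key name and accepted values are whatever this PR finally settles on):
    
    ```
    # spark-defaults.conf -- sketch, assuming the key proposed in this PR
    spark.scheduler.blacklist.strategy  executorAndNode
    ```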
    
    About "running a speculative copy of a task": it might be doable, but as I 
understand it, it adds a lot of complexity to maintaining the correct status of 
the TaskSet/Stage that the "speculative task" belongs to. What if all other 
tasks have finished and the "speculative task" is still running? Should the 
speculative task be retried like normal tasks? What if the "speculative task" 
succeeds but the original task fails? Should the next stage be aware of these 
"speculative tasks"? Etc. And from a system monitoring point of view, 
additional duplicated tasks are created, which may confuse users.
    