Github user Ngone51 commented on the issue:
https://github.com/apache/spark/pull/20998
Hi @felixcheung, thanks for triggering the task and for your comments.
> shouldn't this be up to the scheduler backend?
Actually, it is `TaskSchedulerImpl` that holds a thread which periodically checks
whether there are any speculative tasks that need to be scheduled. If there are,
it calls `backend.reviveOffers` to offer resources. But it is `TaskSetManager`
that decides whether we need to launch a speculative attempt for a certain task.
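    To make that flow concrete, here is a minimal sketch of the periodic check, assuming simplified stand-in types for the scheduler backend and the pool of `TaskSetManager`s; the trait and parameter names below are illustrative, not the exact Spark API:
    ```scala
    import java.util.concurrent.{Executors, TimeUnit}

    // Minimal stand-ins for the real Spark types, just to keep the sketch self-contained.
    trait SchedulerBackendLike { def reviveOffers(): Unit }
    trait PoolLike { def checkSpeculatableTasks(): Boolean }

    class SpeculationCheckerSketch(
        backend: SchedulerBackendLike,
        rootPool: PoolLike,
        speculationIntervalMs: Long) {

      private val speculationScheduler = Executors.newSingleThreadScheduledExecutor()

      def start(): Unit = {
        // The scheduler periodically asks the TaskSetManagers (via the pool) whether
        // any task should get a speculative attempt; if so, it only revives offers.
        // Which task actually gets a speculative attempt is decided later by the
        // TaskSetManager when resources are offered.
        speculationScheduler.scheduleWithFixedDelay(new Runnable {
          override def run(): Unit = {
            if (rootPool.checkSpeculatableTasks()) {
              backend.reviveOffers()
            }
          }
        }, speculationIntervalMs, speculationIntervalMs, TimeUnit.MILLISECONDS)
      }
    }
    ```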
> multiple tasks/attempts can run simultaneously on the same physical host?
    I think multiple task attempts (actually, speculative tasks) can run on the
    same physical host, but not simultaneously, as long as there is no running
    attempt on that host. In the PR description, I illustrate a case in which a
    speculative task chose to run on a host where a previous attempt of the same task
    had run but ultimately failed. I think that if the task's failure is not related to
    the host itself, running on the same host is acceptable (see the sketch below).
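    Here is a hypothetical sketch of the condition I mean; the class and method names are made up for illustration and are not the actual `TaskSetManager` API:
    ```scala
    // Hypothetical names, only for illustration; not the actual TaskSetManager API.
    class SpeculativeHostCheckSketch(
        // task index -> (host an attempt ran on, whether that attempt is still running)
        attempts: Map[Int, Seq[(String, Boolean)]]) {

      /** True if task `index` still has a running attempt on `host`. */
      private def hasRunningAttemptOnHost(index: Int, host: String): Boolean =
        attempts.getOrElse(index, Nil).exists { case (h, running) => h == host && running }

      /** A speculative copy of `index` may go to `host` only if nothing is running there for it. */
      def canSpeculateOnHost(index: Int, host: String): Boolean =
        !hasRunningAttemptOnHost(index, host)
    }
    ```
    The point is that only a currently running attempt on the host blocks a speculative copy; earlier failed attempts there do not.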