Ngone51 commented on issue #27223: [SPARK-30511][CORE] Spark marks intentionally killed speculative tasks as pending leads to holding idle executors URL: https://github.com/apache/spark/pull/27223#issuecomment-575445148 It seems the same problem also exists for a normal task when its speculative attempt finishes before it does. Would it be possible to check whether another attempt of the task has already succeeded when we receive a failed taskEnd event? E.g. ask TaskSchedulerImpl/TaskSetManager, or just record those successful tasks in `ExecutorAllocationManager`.
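To illustrate the second option mentioned above, here is a minimal, language-agnostic sketch (in Python, with hypothetical names; this is not Spark's actual `ExecutorAllocationManager` API) of recording tasks that have a successful attempt, so that a later failed taskEnd for another attempt of the same task is not re-counted as pending:

```python
class AllocationBookkeeping:
    """Hypothetical sketch of the pending-task bookkeeping idea.

    Tracks task indices that have at least one successful attempt, so a
    failed taskEnd event for an already-succeeded task (e.g. a killed
    speculative or original attempt) does not mark the task pending again.
    """

    def __init__(self, total_tasks):
        self.succeeded = set()                  # indices with a successful attempt
        self.pending = set(range(total_tasks))  # indices still awaiting success

    def on_task_end(self, task_index, success):
        if success:
            # First successful attempt wins; the task is no longer pending.
            self.succeeded.add(task_index)
            self.pending.discard(task_index)
        elif task_index not in self.succeeded:
            # Only re-mark as pending if no other attempt has succeeded.
            self.pending.add(task_index)

    def pending_count(self):
        return len(self.pending)
```

For example, if the speculative attempt of task 3 succeeds and the original attempt then ends with a failure (because it was killed), `pending_count()` stays at 3 for a 4-task stage instead of bouncing back to 4, so the allocation manager would not hold an idle executor for it.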
