Hi Spark Users, I’m running Spark jobs on Mesos, and sometimes I see a vast number of TaskScheduler errors:
ERROR TaskSchedulerImpl: Ignoring update with state FINISHED for TID 1161 because its task set is gone (this is likely the result of receiving duplicate task finished status updates)

It looks like just a warning despite beginning with ERROR, and it does no harm to the eventual result, but I don’t know whether these messages indicate that tasks are running slower than usual.

------------------------------
BR,
Todd Leo
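P.S. To gauge whether the volume of these errors tracks slow runs, a rough count can be taken from the driver log. This is only a sketch — the log path and sample contents below are hypothetical:

```shell
# Hypothetical driver log with a mix of normal and duplicate-status lines
cat > /tmp/driver.log <<'EOF'
ERROR TaskSchedulerImpl: Ignoring update with state FINISHED for TID 1161 because its task set is gone
INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 1161)
ERROR TaskSchedulerImpl: Ignoring update with state FINISHED for TID 1162 because its task set is gone
EOF

# Count how often the duplicate-status error appears; compare this count
# across fast and slow runs to see whether it correlates with slowdowns
grep -c "Ignoring update with state FINISHED" /tmp/driver.log
```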