Deegue commented on issue #26541: [SPARK-29910][SQL] Optimize speculation 
performance by adding minimum runtime limit
URL: https://github.com/apache/spark/pull/26541#issuecomment-556901420
 
 
   Thanks for your review @cloud-fan .
   Before this patch, the minimum runtime was effectively 100ms, which means almost every task that meets the other speculation conditions gets speculated. After this patch, with the minimum set to 60s, the many tasks that finish within a few seconds are no longer speculated, and our cluster performs noticeably better. So I think the limit is useful in this case.
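   For reference, the relevant settings look roughly like this in `spark-defaults.conf`. The quantile and multiplier keys are Spark's standard speculation settings; the minimum-runtime key is what this patch adds, and its exact name here is illustrative:

```
spark.speculation                  true
spark.speculation.quantile         0.75
spark.speculation.multiplier       1.5
# key introduced by this patch; name shown here is illustrative
spark.speculation.minTaskRuntime   60s
```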
   
   As for different jobs and stages, `spark.speculation.quantile` and `spark.speculation.multiplier` help decide which tasks need to be speculated in each situation.
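   A minimal sketch of how these settings could interact, assuming a simplified model of the scheduler's check (the function name and the `min_runtime_ms` parameter are hypothetical, not Spark API): speculation only kicks in once the quantile fraction of tasks has finished, and the threshold is the median successful duration times the multiplier, floored at the minimum runtime this patch adds.

```python
import statistics

def speculation_threshold(durations_ms, total_tasks, quantile=0.75,
                          multiplier=1.5, min_runtime_ms=60_000):
    """Return the duration (ms) above which a running task may be
    speculated, or None if too few tasks have finished to judge.

    Simplified model: once `quantile` of `total_tasks` have succeeded,
    the threshold is `multiplier` times the median successful duration,
    floored at `min_runtime_ms` (the minimum runtime limit).
    """
    if len(durations_ms) < quantile * total_tasks:
        return None  # not enough finished tasks yet
    return max(multiplier * statistics.median(durations_ms), min_runtime_ms)

# A stage of short tasks (2-3.5s each): with the old 100ms floor the
# threshold is ~4.1s, so stragglers get speculated eagerly; with a 60s
# floor, these short tasks are left alone.
print(speculation_threshold([2000, 2500, 3000, 3500], 4, min_runtime_ms=100))     # → 4125.0
print(speculation_threshold([2000, 2500, 3000, 3500], 4, min_runtime_ms=60_000))  # → 60000
```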

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]