Github user cloud-fan commented on the issue:

    https://github.com/apache/spark/pull/20490
  
    AFAIK, there is no KILL mechanism for speculative tasks, which means that if 
multiple attempts of the same task are running at the same time and one of them 
finishes first, the others keep running until they finish on their own.
    
    But your concern is valid: users can manually cancel a job/stage, and then 
the task thread will be interrupted.
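    
    For reference, a minimal sketch of how a user triggers that interruption. 
Note that Spark only interrupts task threads on cancellation when the job group 
is registered with `interruptOnCancel = true`; the group id and app name here 
are made up for illustration:
    
    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    
    val sc = new SparkContext(
      new SparkConf().setAppName("demo").setMaster("local[2]"))
    
    // Mark subsequent jobs as interruptible: cancelling this group will call
    // Thread.interrupt() on the running task threads.
    sc.setJobGroup("my-writes", "interruptible write job", interruptOnCancel = true)
    
    // ... submit the write job from another thread ...
    
    sc.cancelJobGroup("my-writes")  // interrupts the tasks of all jobs in the group
    ```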
    
    Fortunately, this case is handled correctly. The writing logic is wrapped 
in a `try-catch`. If the task is interrupted, an `InterruptedException` is 
thrown and caught, and `DataWriter.abort` is called.
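    
    To make the flow concrete, here is a simplified sketch of that pattern. 
This is not the actual Spark source (the real task wraps the loop with internal 
utilities), just an illustration of how an interrupt during the write loop ends 
up calling `DataWriter.abort` before the exception propagates:
    
    ```scala
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.sources.v2.writer.{DataWriter, WriterCommitMessage}
    
    def runWritingTask(writer: DataWriter[Row], rows: Iterator[Row]): WriterCommitMessage = {
      try {
        // An interrupt from job/stage cancellation surfaces here as an exception.
        while (rows.hasNext) {
          writer.write(rows.next())
        }
        writer.commit()  // reached only if every row was written successfully
      } catch {
        case t: Throwable =>
          writer.abort()  // roll back the partially written data
          throw t
      }
    }
    ```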

