ajithme commented on issue #24142: [SPARK-27194][core] Job failures when task attempts do not clean up spark-staging parquet files
URL: https://github.com/apache/spark/pull/24142#issuecomment-474980913
 
 
   > There might be edge cases where the failed task may still have the file 
opened and deleting from another place may not work as expected.
   But in this case, since the executor has exited, wouldn't the lease on the remote file expire, letting the retry task delete it successfully? (Behaviour may differ if the filesystem is not HDFS.)
   
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
