dongjoon-hyun edited a comment on pull request #32983:
URL: https://github.com/apache/spark/pull/32983#issuecomment-864491601


   Thank you for the review, @HyukjinKwon, @sunchao, and @viirya .
   - The current AS-IS master branch UTs fail with this error when it
happens.
   - This PR aims to ignore the Hadoop exception when it is caused by copying a
file onto itself. If src=dest, there is no regression here.
   
   Yes, both skipping on the Spark side and overwriting on the Spark side are possible.
For this PR, I just wanted to fix the UT failure without changing the previous
Spark logic (invoking Hadoop).
   > Instead of catching the exception, Maybe we can just check if src and dst 
are the same, and if so skip the copying?
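
The alternative quoted above could be sketched roughly as follows. This is only an illustration of the "compare src and dst, skip if equal" idea, not code from the PR; the helper name `shouldCopy` and the example URIs are hypothetical.

```java
import java.net.URI;

public class CopySkipSketch {
    // Hypothetical helper: only copy when source and destination
    // do not resolve to the same location.
    static boolean shouldCopy(URI src, URI dest) {
        return !src.normalize().equals(dest.normalize());
    }

    public static void main(String[] args) {
        URI src       = URI.create("hdfs://nn/warehouse/t/part-0000");
        URI sameDest  = URI.create("hdfs://nn/warehouse/t/part-0000");
        URI otherDest = URI.create("hdfs://nn/tmp/part-0000");

        System.out.println(shouldCopy(src, sameDest));  // same file: skip the copy
        System.out.println(shouldCopy(src, otherDest)); // different target: proceed
    }
}
```

The trade-off discussed in the thread is that this check must run before invoking Hadoop, whereas the PR's approach leaves the existing call path untouched and only tolerates the resulting exception.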


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]
