Ngone51 commented on PR #39918:
URL: https://github.com/apache/spark/pull/39918#issuecomment-1420450697

   I doubt we should use the DEBUG level in this case. Right now, the corruption 
here can only be caused by either a disk issue or a network issue. Both of them 
could be transient (a problematic disk could be persistent, but Spark doesn't 
guarantee writing files to the same disk partition each time) or hard to 
reproduce. So I'm afraid that logging at the DEBUG level could easily miss the 
cause in the first place.
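   To illustrate the concern, here is a minimal sketch using plain 
`java.util.logging` (standing in for Spark's actual log4j-based logging; the 
logger name "shuffle" and the class name are hypothetical): under the JDK's 
default configuration the effective level is INFO, so a DEBUG-level message 
about a transient corruption is silently dropped unless the user had already 
enabled DEBUG before the fault occurred.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LogLevelDemo {
    public static void main(String[] args) {
        Logger logger = Logger.getLogger("shuffle");
        // With the JDK default configuration the effective level is INFO,
        // so FINE (the JUL equivalent of DEBUG) is not loggable.
        System.out.println("FINE loggable:    " + logger.isLoggable(Level.FINE));
        System.out.println("WARNING loggable: " + logger.isLoggable(Level.WARNING));
        // A transient disk/network corruption reported only at DEBUG is
        // therefore invisible by default -- and enabling DEBUG afterwards
        // is too late for a fault that may never reproduce.
    }
}
```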


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

