AngersZhuuuu commented on a change in pull request #29319:
URL: https://github.com/apache/spark/pull/29319#discussion_r482797232



##########
File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveDirCommand.scala
##########
@@ -108,8 +109,11 @@ case class InsertIntoHiveDirCommand(
         outputLocation = tmpPath.toString)
 
       if (overwrite && fs.exists(writeToPath)) {
+        val isTrashEnabled = sparkSession.sessionState.conf.trashEnabled
         fs.listStatus(writeToPath).foreach { existFile =>
-          if (Option(existFile.getPath) != createdTempDir) fs.delete(existFile.getPath, true)
+          if (Option(existFile.getPath) != createdTempDir) {
+            Utils.moveToTrashOrDelete(fs, existFile.getPath, isTrashEnabled, hadoopConf)
+          }

Review comment:
       I am doing this too. This PR (for example, the INSERT OVERWRITE DIRECTORY part) can protect our users when they accidentally write to a directory path that already contains data (such as a database's path, which has happened before).
   Moving the data to trash makes recovery of production data faster in the event of such a disaster.
   FYI @maropu
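
   For reference, here is a minimal sketch of what a `moveToTrashOrDelete`-style helper could look like on top of Hadoop's `Trash.moveToAppropriateTrash` API; the actual `Utils.moveToTrashOrDelete` added in this PR may differ, for example in how it reports or falls back when the move to trash does not happen:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path, Trash}

object TrashUtilSketch {
  /**
   * Hypothetical sketch: when the trash feature is enabled, try to move the path
   * into the Hadoop trash directory (so it can be recovered later, e.g. after an
   * accidental INSERT OVERWRITE DIRECTORY over a DB path); otherwise, or if the
   * move does not happen, fall back to a plain recursive delete.
   */
  def moveToTrashOrDelete(
      fs: FileSystem,
      path: Path,
      isTrashEnabled: Boolean,
      hadoopConf: Configuration): Boolean = {
    // Trash.moveToAppropriateTrash returns true if the path was moved into trash.
    val movedToTrash = isTrashEnabled && Trash.moveToAppropriateTrash(fs, path, hadoopConf)
    // Fall back to a recursive delete when trash is disabled or the move did not happen.
    movedToTrash || fs.delete(path, true)
  }
}
```

   With a helper like this, the existing `fs.delete(existFile.getPath, true)` call sites only need the extra `isTrashEnabled` flag and the `hadoopConf` to gain the recoverable code path.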




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
