Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17858#discussion_r115879727
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala ---
    @@ -97,12 +97,24 @@ case class InsertIntoHiveTable(
         val inputPathUri: URI = inputPath.toUri
         val inputPathName: String = inputPathUri.getPath
         val fs: FileSystem = inputPath.getFileSystem(hadoopConf)
    -    val stagingPathName: String =
    +    var stagingPathName: String =
           if (inputPathName.indexOf(stagingDir) == -1) {
             new Path(inputPathName, stagingDir).toString
           } else {
             inputPathName.substring(0, inputPathName.indexOf(stagingDir) + stagingDir.length)
           }
    +
    +    // SPARK-20594: This is a walk-around fix to resolve a Hive bug. Hive requires that the
    +    // staging directory needs to avoid being deleted when users set hive.exec.stagingdir
    +    // under the table directory.
    +    if (FileUtils.isSubDir(new Path(stagingPathName), inputPath, fs)
    +      && !stagingPathName.stripPrefix(inputPathName).stripPrefix(File.separator).startsWith(".")) {
    --- End diff ---
    
    Nit: Please move the `&&` to the end of line 110.
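
    For reference, a minimal sketch of how the condition might read with the `&&` pulled up to the end of line 110 (same names as in the diff above; this only illustrates the style nit, not the exact final change):

        if (FileUtils.isSubDir(new Path(stagingPathName), inputPath, fs) &&
            !stagingPathName.stripPrefix(inputPathName).stripPrefix(File.separator).startsWith(".")) {
          // the staging dir sits under the table dir and does not start with "."
        }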


