wangyum commented on a change in pull request #24018: [SPARK-23749][SQL] Replace built-in Hive API (isSub/toKryo) and remove OrcProto.Type usage
URL: https://github.com/apache/spark/pull/24018#discussion_r265421635
 
 

 ##########
 File path: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/SaveAsHiveFile.scala
 ##########
 @@ -253,6 +253,13 @@ private[hive] trait SaveAsHiveFile extends DataWritingCommand {
     dir
   }
 
+  // HIVE-14259 removed FileUtils.isSubDir(). Adapted it from Hive 1.2's FileUtils.isSubDir().
 
 Review comment:
   It seems that the `isSubDir` prefix-matching bug does not affect Spark:
   ```scala
     test("hive.exec.stagingdir start with table dir") {
      // hive.exec.stagingdir: file:/private/var/folders/warehouse-c49050a7/test_table_prefix
      // table dir           : file:/private/var/folders/warehouse-c49050a7/test_table
       withSQLConf("hive.exec.stagingdir" -> "../test_table_prefix") {
         withTable("test_table") {
           sql("CREATE TABLE test_table (key int)")
           sql("INSERT OVERWRITE TABLE test_table SELECT 1")
           checkAnswer(sql("SELECT * FROM test_table"), Row(1))
         }
       }
     }
   ```
   But I'd like to fix `isSubDir` itself:
   ```scala
    // HIVE-14259 removed FileUtils.isSubDir(). Adapted it from Hive 1.2's FileUtils.isSubDir().
     private def isSubDir(p1: Path, p2: Path, fs: FileSystem): Boolean = {
       val path1 = fs.makeQualified(p1).toString + Path.SEPARATOR
       val path2 = fs.makeQualified(p2).toString + Path.SEPARATOR
       path1.startsWith(path2)
     }
   ```
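
  For illustration, here is a minimal, self-contained sketch of why the trailing `Path.SEPARATOR` matters (the `IsSubDirCheck` object and the `/warehouse/...` paths are hypothetical, not from the PR):
  ```scala
  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.{FileSystem, Path}

  object IsSubDirCheck {
    // Same logic as the proposed fix: qualify both paths and append the
    // separator so a sibling dir that shares a name prefix is not
    // misclassified as a subdirectory.
    private def isSubDir(p1: Path, p2: Path, fs: FileSystem): Boolean = {
      val path1 = fs.makeQualified(p1).toString + Path.SEPARATOR
      val path2 = fs.makeQualified(p2).toString + Path.SEPARATOR
      path1.startsWith(path2)
    }

    def main(args: Array[String]): Unit = {
      val fs = FileSystem.getLocal(new Configuration())
      val tableDir = new Path("/warehouse/test_table")
      val stagingDir = new Path("/warehouse/test_table_prefix") // sibling, not a child

      // A plain string prefix check without the separator is fooled:
      val qualifiedTable = fs.makeQualified(tableDir).toString
      val qualifiedStaging = fs.makeQualified(stagingDir).toString
      println(qualifiedStaging.startsWith(qualifiedTable)) // true: the prefix bug

      // With the trailing separator the sibling is correctly rejected,
      // while a real child is still detected:
      println(isSubDir(stagingDir, tableDir, fs)) // false
      println(isSubDir(new Path("/warehouse/test_table/part-0"), tableDir, fs)) // true
    }
  }
  ```
  Note that appending the separator to both sides also keeps the check true when the two paths are identical.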
