dongjoon-hyun edited a comment on issue #28232: [SPARK-31459][SQL] fix insert overwrite directory target path is an existing file
URL: https://github.com/apache/spark/pull/28232#issuecomment-615548088

Ur, @mcdull-zhang. When I try this on Apache Spark 3.0.0-RC1 and 2.4.5, I get the following. The current behavior looks correct to me.

**3.0.0-RC1**
```scala
scala> spark.version
res1: String = 3.0.0

scala> sql("INSERT OVERWRITE LOCAL DIRECTORY '/tmp/p1' STORED AS ORC SELECT 1").show
java.lang.RuntimeException: Cannot create staging directory 'file:/tmp/p1/.hive-staging_hive_2020-04-18_03-27-45_173_5810881255552839519-1': Parent path is not a directory: file:/tmp/p1
```

**2.4.5**
```scala
scala> spark.version
res1: String = 2.4.5

scala> sql("INSERT OVERWRITE LOCAL DIRECTORY '/tmp/p1' STORED AS ORC SELECT 1").show
java.lang.RuntimeException: Cannot create staging directory 'file:/tmp/p1/.hive-staging_hive_2020-04-18_03-36-22_566_4239182152715274882-1': Parent path is not a directory: file:/tmp/p1
```

Could you give me an example of how you got an incorrect result from the final operation?

> When using the insert overwrite directory syntax, if the target path is an existing file, the final operation result is incorrect.
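For reference, a minimal sketch of how the "target path is an existing file" scenario described in the title can be set up in a spark-shell session. This assumes `/tmp/p1` does not yet exist and is pre-created as a regular file (not a directory); it is an illustration of the reproduction being asked for, not a confirmed repro from the reporter.

```scala
// Sketch: pre-create /tmp/p1 as a plain file, then run the INSERT OVERWRITE
// LOCAL DIRECTORY statement against that same path.
import java.nio.file.{Files, Paths}

// Create the target path as a regular file so it is "an existing file".
val target = Paths.get("/tmp/p1")
if (Files.notExists(target)) Files.createFile(target)

// On 3.0.0-RC1 and 2.4.5 this fails while creating the .hive-staging directory,
// because the parent path is a file rather than a directory (see the stack
// traces above).
spark.sql("INSERT OVERWRITE LOCAL DIRECTORY '/tmp/p1' STORED AS ORC SELECT 1").show()
```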
