LuciferYang commented on code in PR #40737:
URL: https://github.com/apache/spark/pull/40737#discussion_r1162361532


##########
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala:
##########
@@ -2071,7 +2071,10 @@ abstract class DDLSuite extends QueryTest with DDLSuiteBase {
   }
 
   test(s"Add a directory when 
${SQLConf.LEGACY_ADD_SINGLE_FILE_IN_ADD_FILE.key} set to false") {
-    val directoryToAdd = Utils.createTempDir("/tmp/spark/addDirectory/")
+    // SPARK-43093: Don't use `withTempDir` to clean up the temp dir; deleting it causes
+    // test cases in the shared session that execute `Executor.updateDependencies` to fail.
+    val directoryToAdd = Utils.createDirectory(
+      root = Utils.createTempDir().getCanonicalPath, namePrefix = "addDirectory")

Review Comment:
   If we use `withTempDir`, other tests in the shared session will fail. For example:
   
   ```
   [info] - Database Ownership *** FAILED *** (83 milliseconds)
   [info]   org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 300.0 failed 1 times, most recent failure: Lost task 0.0 in stage 300.0 (TID 291) (localhost executor driver): org.apache.spark.SparkException: File /${tmp-root-dir}/spark-48633fe9-cd8c-41c7-afb1-1ef32788d898/userFiles-d41161dc-f90f-4602-8191-816c066d56e6/spark-1edc9e26-5bfb-4660-b87c-5d637903f83c exists and does not match contents of file:/${tmp-root-dir}/spark-1edc9e26-5bfb-4660-b87c-5d637903f83c/
   [info]  at org.apache.spark.util.Utils$.copyFile(Utils.scala:720)
   [info]  at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:813)
   [info]  at org.apache.spark.util.Utils$.fetchFile(Utils.scala:555)
   [info]  at org.apache.spark.executor.Executor.$anonfun$updateDependencies$5(Executor.scala:1033)
   [info]  at org.apache.spark.executor.Executor.$anonfun$updateDependencies$5$adapted(Executor.scala:1030)
   [info]  at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:985)
   [info]  at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
   [info]  at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
   [info]  at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
   [info]  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
   [info]  at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
   [info]  at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:984)
   [info]  at org.apache.spark.executor.Executor.updateDependencies(Executor.scala:1030)
   [info]  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:511)
   [info]  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   [info]  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   [info]  at java.lang.Thread.run(Thread.java:750)
   ```
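   
   Here is a minimal, self-contained Scala sketch of the lifecycle mismatch behind this failure (the `withTempDir` helper and the `registeredPath` registry are illustrative stand-ins, not the actual `DDLSuite`/`SparkSession` code): the helper deletes the directory when the block exits, but the shared session still remembers the added path, so the next test that triggers `Executor.updateDependencies` fetches a stale entry.
   
   ```
   import java.io.File
   import java.nio.file.Files
   
   object TempDirLifecycle {
     // Stand-in for the suite helper: create a temp dir, run `f`, then delete the dir.
     def withTempDir(f: File => Unit): Unit = {
       val dir = Files.createTempDirectory("spark-test").toFile
       try f(dir)
       finally {
         Option(dir.listFiles()).foreach(_.foreach(_.delete()))
         dir.delete()
       }
     }
   
     def main(args: Array[String]): Unit = {
       // Stand-in for the shared session's registry of added files/directories.
       var registeredPath: Option[File] = None
   
       withTempDir { dir =>
         registeredPath = Some(dir) // the test body registers the dir, as `ADD FILE` would
       }
   
       // The block has exited and the dir is deleted, but the shared registry
       // still holds the path; the next test that runs
       // `Executor.updateDependencies` tries to fetch it and fails.
       registeredPath.foreach(p => println(s"${p.getPath} exists: ${p.exists()}"))
     }
   }
   ```
   
   Creating the directory under a fresh `Utils.createTempDir()` root instead leaves cleanup to the shutdown hook, so the path outlives the individual test, which is why the patch avoids `withTempDir` here.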


