Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16451#discussion_r94575429
  
    --- Diff: external/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaTestUtils.scala ---
    @@ -138,10 +139,15 @@ class KafkaTestUtils extends Logging {
     
         if (server != null) {
           server.shutdown()
    +      server.awaitShutdown()
           server = null
         }
     
    -    brokerConf.logDirs.foreach { f => Utils.deleteRecursively(new File(f)) }
    +    // On Windows, `logDirs` is left open even after the Kafka server above is completely shut down
    +    // in some cases. It leads to test failures on Windows if these are not ignored.
    +    brokerConf.logDirs.map(new File(_))
    +      .filterNot(FileUtils.deleteQuietly)
    --- End diff ---
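
    For context, Commons IO's FileUtils.deleteQuietly never throws and returns
    false when a file or directory cannot be removed, so the snippet above keeps
    only the directories that were left behind. A minimal, self-contained sketch
    of that pattern (the cleanUpLogDirs name here is made up for illustration):

        import java.io.File

        import org.apache.commons.io.FileUtils

        // deleteQuietly swallows any exception and returns false when the
        // target (or something inside it) cannot be removed, so the dirs that
        // survive the delete attempt can be collected and handled separately
        // instead of failing the whole teardown.
        def cleanUpLogDirs(logDirs: Seq[String]): Seq[File] = {
          logDirs
            .map(new File(_))
            .filterNot(FileUtils.deleteQuietly)  // keep only what was left behind
        }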
    
    Hm, I see. I wonder why it fails to delete? That seems like another general
    problem we'd have to fix for Windows, because deleteRecursively is used in
    many places in the code. I suppose it's possible to have the recursive delete
    continue even if one dir can't be deleted? Does it need to delete the
    contents first?
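
    As for letting the recursive delete keep going, a rough, purely illustrative
    sketch (not the existing Utils.deleteRecursively) could delete the contents
    first and collect failures instead of throwing:

        import java.io.File

        // Best-effort recursive delete: children are removed before the
        // directory itself, and a failure on one entry does not abort the rest.
        // The paths that could not be deleted are returned so the caller can
        // decide whether to ignore or report them.
        def bestEffortDelete(file: File): Seq[File] = {
          val leftOver =
            if (file.isDirectory) {
              Option(file.listFiles()).getOrElse(Array.empty[File])
                .flatMap(bestEffortDelete).toSeq
            } else {
              Seq.empty[File]
            }
          // On Windows an open handle usually makes delete() return false
          // rather than throw, so just record the path and move on.
          if (file.delete() || !file.exists()) leftOver else leftOver :+ file
        }

    Even then, the log dirs would only go away once whatever is holding them
    open on Windows lets go, which matches the comment in the diff above.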

