Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22504#discussion_r226456519
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
    @@ -117,6 +118,13 @@ private[history] class FsHistoryProvider(conf: SparkConf, clock: Clock)
       // Visible for testing
       private[history] val fs: FileSystem = new Path(logDir).getFileSystem(hadoopConf)
     
    +  private val driverLogFs: Option[FileSystem] =
    +    if (conf.get(DRIVER_LOG_DFS_DIR).isDefined) {
    +      Some(FileSystem.get(hadoopConf))
    --- End diff ---
    
    This is not right. It assumes the configured directory lives in the defaultFS. See the call right above this one for what you should do here.
    
    I also don't see a good reason to keep this in a field. Just get the FS reference when the log cleaner runs.
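    
    Something along these lines would do it (a minimal sketch; `cleanDriverLogs` is a hypothetical method name here, while `DRIVER_LOG_DFS_DIR`, `conf`, and `hadoopConf` are taken from the diff):
    
        import org.apache.hadoop.fs.{FileSystem, Path}
    
        private def cleanDriverLogs(): Unit = {
          conf.get(DRIVER_LOG_DFS_DIR).foreach { dir =>
            // Resolve the FS from the configured path, mirroring how `fs` is
            // derived from logDir above, instead of assuming the default FS.
            val driverLogFs: FileSystem = new Path(dir).getFileSystem(hadoopConf)
            // ... use driverLogFs to list and delete expired driver logs ...
          }
        }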

