Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14396#discussion_r72839076
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
    @@ -1430,14 +1430,10 @@ class SparkContext(config: SparkConf) extends Logging with ExecutorAllocationCli
           schemeCorrectedPath
         }
         val timestamp = System.currentTimeMillis
    -    addedFiles(key) = timestamp
    -
    -    // Fetch the file locally in case a job is executed using DAGScheduler.runLocally().
    -    Utils.fetchFile(path, new File(SparkFiles.getRootDirectory()), conf, env.securityManager,
    --- End diff ---
    
    I believe that this line is unnecessary now that `runLocally` was removed
    a few releases ago. However, we might still need it to handle the somewhat
    obscure corner case where a user's `reduce` function / closure executes on
    the driver and accesses the `SparkFiles` object.
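
    For concreteness, here is a rough sketch of that corner case (the app name,
    master, and file path are made up for illustration): the final merge step of
    `reduce` runs on the driver, so a closure that calls `SparkFiles.get` would
    also execute driver-side and expect the file to have been fetched there:

    ```scala
    import scala.io.Source

    import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

    object SparkFilesOnDriver {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("sparkfiles-driver-access").setMaster("local[2]"))

        // Hypothetical file: assumed to exist locally and contain one integer.
        sc.addFile("/tmp/offsets.txt")

        val result = sc.parallelize(1 to 4).reduce { (a, b) =>
          // `reduce` merges the executors' partial results on the driver, so this
          // closure can run driver-side and still needs SparkFiles.get to resolve.
          val offset = Source.fromFile(SparkFiles.get("offsets.txt"))
            .getLines().next().trim.toInt
          a + b + offset
        }

        println(result)
        sc.stop()
      }
    }
    ```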

