Github user mccheah commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21923#discussion_r209805442
  
    --- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
    @@ -130,6 +130,12 @@ private[spark] class Executor(
       private val urlClassLoader = createClassLoader()
       private val replClassLoader = addReplClassLoaderIfNeeded(urlClassLoader)
     
    +  Thread.currentThread().setContextClassLoader(replClassLoader)
    +  conf.get(EXECUTOR_PLUGINS).foreach { classes =>
    +    Utils.loadExtensions(classOf[ExecutorPlugin], classes, conf)
    --- End diff ---
    
    For all cluster managers, would this properly load plugins deployed via `--jars` in spark-submit or `spark.jars` in the SparkConf? I know that how jars are deployed, and when they become available on the classpath, can vary between cluster managers. In the worst case, this seems like the kind of thing one may prefer to put on `spark.executor.extraClassPath`, simply because those jars are guaranteed to be loaded at JVM boot time.
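
    To make the trade-off concrete, here is a minimal sketch of the two deployment styles (the plugin class and jar paths are hypothetical; `spark.executor.plugins` is the config key this PR introduces):

    ```scala
    import org.apache.spark.SparkConf

    // Deployed with the application: the jar is distributed by Spark after
    // the executor JVM starts, so whether it is visible by the time plugins
    // are loaded can depend on the cluster manager.
    val viaSparkJars = new SparkConf()
      .set("spark.executor.plugins", "com.example.MetricsPlugin") // hypothetical plugin
      .set("spark.jars", "hdfs:///deps/metrics-plugin.jar")       // hypothetical path

    // Deployed on the executor classpath: the jar must already be present on
    // every node, but it is guaranteed to be loaded at JVM boot time.
    val viaExtraClassPath = new SparkConf()
      .set("spark.executor.plugins", "com.example.MetricsPlugin")
      .set("spark.executor.extraClassPath", "/opt/spark/plugins/metrics-plugin.jar")
    ```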
    
    In fact, I wonder if we should move this extension loading even further up in the lifecycle, simply so that the plugins are around for a larger percentage of the executor JVM's uptime.
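
    For reference, a bare-bones plugin sketch, assuming the `ExecutorPlugin` interface proposed here exposes `init()` and `shutdown()` hooks (the class name and logging are hypothetical):

    ```scala
    import org.apache.spark.ExecutorPlugin

    // Hypothetical plugin: the earlier in the executor lifecycle init() runs,
    // the more of the JVM's uptime its monitoring covers.
    class UptimeLoggingPlugin extends ExecutorPlugin {
      private var startMs: Long = _

      override def init(): Unit = {
        startMs = System.currentTimeMillis()
        println(s"UptimeLoggingPlugin initialized at $startMs")
      }

      override def shutdown(): Unit = {
        println(s"UptimeLoggingPlugin lived ${System.currentTimeMillis() - startMs} ms")
      }
    }
    ```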

