Github user mridulm commented on a diff in the pull request:
https://github.com/apache/spark/pull/22192#discussion_r216118263
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -136,6 +136,26 @@ private[spark] class Executor(
   // for fetching remote cached RDD blocks, so need to make sure it uses the right classloader too.
   env.serializerManager.setDefaultClassLoader(replClassLoader)
 
 + private val executorPlugins: Seq[ExecutorPlugin] = {
 +   val pluginNames = conf.get(EXECUTOR_PLUGINS)
 +   if (pluginNames.nonEmpty) {
 +     logDebug(s"Initializing the following plugins: ${pluginNames.mkString(", ")}")
 +
 +     // Plugins need to load using a class loader that includes the executor's user classpath
 +     val pluginList: Seq[ExecutorPlugin] =
 +       Utils.withContextClassLoader(replClassLoader) {
 +         val plugins = Utils.loadExtensions(classOf[ExecutorPlugin], pluginNames, conf)
 +         plugins.foreach(_.init())
--- End diff --
On second thought, I am not sure the latter should fail the executor
... for example, if a plugin is unable to write a file, etc.
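
For illustration, a minimal sketch of that alternative, assuming we only want to log and drop a plugin whose init() throws rather than fail the executor. The flatMap/NonFatal handling below is hypothetical and not the code in this PR; it reuses the symbols from the diff above (Utils, replClassLoader, pluginNames, conf, logWarning) and assumes `import scala.util.control.NonFatal` is in scope:

    // Hypothetical sketch (not the PR's code): skip plugins whose init()
    // throws instead of failing the executor.
    val pluginList: Seq[ExecutorPlugin] =
      Utils.withContextClassLoader(replClassLoader) {
        val plugins = Utils.loadExtensions(classOf[ExecutorPlugin], pluginNames, conf)
        plugins.flatMap { plugin =>
          try {
            plugin.init()
            Some(plugin)
          } catch {
            case NonFatal(e) =>
              // e.g. the plugin could not write a file; log and drop it
              // rather than bring down the executor
              logWarning(s"Plugin ${plugin.getClass.getName} failed to initialize, skipping it.", e)
              None
          }
        }
      }

Whether that is preferable to fast-failing probably depends on whether running with a partially initialized plugin set is acceptable.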