GitHub user NiharS commented on a diff in the pull request:
https://github.com/apache/spark/pull/22192#discussion_r213828422
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -130,6 +130,14 @@ private[spark] class Executor(
private val urlClassLoader = createClassLoader()
private val replClassLoader = addReplClassLoaderIfNeeded(urlClassLoader)
+ // Load plugins in the current thread; they are expected not to block.
+ // Heavy computation during plugin initialization should be done asynchronously.
+ Thread.currentThread().setContextClassLoader(replClassLoader)
+ conf.get(EXECUTOR_PLUGINS).foreach { classes =>
--- End diff ---
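(For context on the quoted block: the body of the `foreach` is not included in the diff above, so the following is only a hedged, self-contained sketch of the loading pattern being discussed. The `PluginLoadSketch` wrapper and the `org.apache.spark.ExecutorPlugin` import location are assumptions rather than the PR's actual code; the shape of the `Utils.loadExtensions` call is taken from the compile error quoted below.)

```scala
package org.apache.spark.executor

import org.apache.spark.{ExecutorPlugin, SparkConf}
import org.apache.spark.util.Utils

// Hypothetical sketch of the plugin-loading step referenced in the diff:
// given the class names from the executor-plugins config, instantiate one
// ExecutorPlugin per name via Utils.loadExtensions. The diff sets the
// context class loader to the REPL class loader first so that
// user-supplied plugin classes can be resolved.
object PluginLoadSketch {
  def loadPlugins(conf: SparkConf, classNames: Seq[String]): Seq[ExecutorPlugin] = {
    Utils.loadExtensions(classOf[ExecutorPlugin], classNames, conf)
  }
}
```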
@vanzin If I'm understanding correctly, in the ConfigBuilder I'm supposed to change
`.createOptional` to `.createWithDefault(Nil)`. This gives the error:

`[error] /Users/nsheth/personal_fork/spark/core/src/main/scala/org/apache/spark/executor/Executor.scala:136: type mismatch;`
`[error]  found   : String`
`[error]  required: Seq[String]`
`[error] Utils.loadExtensions(classOf[ExecutorPlugin], classes, conf)`

Granted, I have no idea why it's able to cast a `Nil` to a string for this
config.
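For reference, here is a minimal sketch of the two config shapes under discussion. The actual builder chain for `EXECUTOR_PLUGINS` is not quoted in this thread, so the `.stringConf.toSequence` form, the sketch key names, and the `ExecutorPluginConfSketch` wrapper below are assumptions rather than the PR's code:

```scala
package org.apache.spark.internal.config

// Hypothetical sketch; the real EXECUTOR_PLUGINS definition is not quoted in
// this thread. Distinct keys are used only so both variants can be registered
// side by side (ConfigEntry rejects duplicate keys).
object ExecutorPluginConfSketch {

  // Optional form: conf.get(...) yields Option[Seq[String]], so
  // .foreach { classes => ... } binds `classes` to the whole Seq[String].
  val PLUGINS_OPTIONAL: OptionalConfigEntry[Seq[String]] =
    ConfigBuilder("spark.executor.plugins.sketchOptional")
      .stringConf
      .toSequence
      .createOptional

  // Default form: conf.get(...) yields Seq[String] directly, so
  // .foreach { classes => ... } binds `classes` to each individual String,
  // which matches the "found: String, required: Seq[String]" error at the
  // Utils.loadExtensions call. Nil is accepted as a default because, after
  // .toSequence, the entry's value type is Seq[String], not String.
  val PLUGINS_WITH_DEFAULT: ConfigEntry[Seq[String]] =
    ConfigBuilder("spark.executor.plugins.sketchDefault")
      .stringConf
      .toSequence
      .createWithDefault(Nil)
}
```

Under that assumption, the mismatch would come from `.foreach` now iterating the elements of a `Seq[String]` instead of an `Option[Seq[String]]`, not from `Nil` being converted to a `String`.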
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]