Github user devaraj-kavali commented on a diff in the pull request:
https://github.com/apache/spark/pull/19616#discussion_r168363672
--- Diff: resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -656,7 +664,9 @@ private[spark] class Client(
     // Clear the cache-related entries from the configuration to avoid them polluting the
     // UI's environment page. This works for client mode; for cluster mode, this is handled
     // by the AM.
-    CACHE_CONFIGS.foreach(sparkConf.remove)
+    if (!isClientUnmanagedAMEnabled) {
--- End diff ---
It also clears the classpath entries, which leads to this error in the executors:
```
Error: Could not find or load main class org.apache.spark.executor.CoarseGrainedExecutorBackend
```