Github user ScrapCodes commented on a diff in the pull request:
https://github.com/apache/spark/pull/17357#discussion_r134985214
--- Diff: core/src/main/scala/org/apache/spark/deploy/rest/StandaloneRestServer.scala ---
@@ -139,7 +139,9 @@ private[rest] class StandaloneSubmitRequestServlet(
     val driverExtraLibraryPath = sparkProperties.get("spark.driver.extraLibraryPath")
     val superviseDriver = sparkProperties.get("spark.driver.supervise")
     val appArgs = request.appArgs
-    val environmentVariables = request.environmentVariables
+    // Filter SPARK_LOCAL environment variables from being set on the remote system.
+    val environmentVariables =
+      request.environmentVariables.filterNot(_._1.startsWith("SPARK_LOCAL"))
--- End diff --
You are right, but shouldn't all SPARK_LOCAL* properties be picked up from
the local environment of the node where the driver is going to be started?
Not filtering them would mean that these local properties are common to
all nodes.
But for this particular bug, `SPARK_LOCAL_DIRS` is not strictly required to be
filtered.
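
For illustration only, here is a minimal, self-contained Scala sketch of the
filtering behaviour in the diff above (the map contents are hypothetical, not
taken from the PR). Without the filter, these node-local settings travel with
the submit request and end up common to all nodes:

```scala
object FilterSparkLocalEnvVars {
  def main(args: Array[String]): Unit = {
    // Hypothetical environment variables carried along with a submit request.
    val environmentVariables = Map(
      "SPARK_LOCAL_DIRS" -> "/tmp/spark-local", // scratch dirs of the submitting node
      "SPARK_LOCAL_IP"   -> "192.168.1.10",     // bind address of the submitting node
      "SPARK_LOG_DIR"    -> "/var/log/spark"    // not a SPARK_LOCAL* key, kept as-is
    )

    // Same filter as in the diff: drop every SPARK_LOCAL* variable so the driver
    // picks these values up from the local environment of the node it starts on,
    // instead of inheriting the submitting node's values.
    val filtered = environmentVariables.filterNot(_._1.startsWith("SPARK_LOCAL"))

    println(filtered) // Map(SPARK_LOG_DIR -> /var/log/spark)
  }
}
```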