Github user weineran commented on a diff in the pull request:
https://github.com/apache/spark/pull/10869#discussion_r50706575
--- Diff: docs/configuration.md ---
@@ -1700,6 +1700,8 @@ to use on each machine and maximum memory.
Since `spark-env.sh` is a shell script, some of these can be set
programmatically -- for example, you might
compute `SPARK_LOCAL_IP` by looking up the IP of a specific network
interface.
+Note: When running Spark on YARN in cluster mode, environment variables
need to be set using the
<code>spark.yarn.appMasterEnv.[EnvironmentVariableName]</code> property in your
`conf/spark-defaults.conf` file. Environment variables that are set in
`spark-env.sh` will not be reflected in the YARN Application Master process in
cluster mode. See the [YARN-related Spark
Properties](running-on-yarn.html#spark-properties) for more information.
--- End diff ---
Ah well that could be. Does the <code> tag only get used in tables? Let
me know if I should switch to back-ticks.
Just realized I should throw some back-ticks around `cluster` too.
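For anyone following along, here is a minimal sketch of the setting the diff describes. The property name comes from the diff itself; the variable names and values below are purely illustrative:

```
# conf/spark-defaults.conf -- illustrative values only
# In YARN cluster mode, environment variables for the Application Master
# are set via spark.yarn.appMasterEnv.[EnvironmentVariableName],
# not via spark-env.sh:
spark.yarn.appMasterEnv.SPARK_LOCAL_IP   192.168.1.10
spark.yarn.appMasterEnv.JAVA_HOME        /usr/lib/jvm/java-8-openjdk
```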