yutoacts commented on a change in pull request #33777:
URL: https://github.com/apache/spark/pull/33777#discussion_r691749052
##########
File path: docs/configuration.md
##########
@@ -3075,7 +3075,7 @@ to use on each machine and maximum memory.
Since `spark-env.sh` is a shell script, some of these can be set
programmatically -- for example, you might
compute `SPARK_LOCAL_IP` by looking up the IP of a specific network interface.
-Note: When running Spark on YARN in `cluster` mode, environment variables need
to be set using the `spark.yarn.appMasterEnv.[EnvironmentVariableName]`
property in your `conf/spark-defaults.conf` file. Environment variables that
are set in `spark-env.sh` will not be reflected in the YARN Application Master
process in `cluster` mode. See the [YARN-related Spark
Properties](running-on-yarn.html#spark-properties) for more information.
Review comment:
I think I totally misunderstood what it says. Thank you for the
correction.
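
For reference, a minimal sketch of the configuration the quoted passage
describes — setting an environment variable for the YARN Application
Master in `cluster` mode via `conf/spark-defaults.conf`. The variable
name (`JAVA_HOME`) and its value are illustrative, not taken from the PR:

```
# conf/spark-defaults.conf
# Exposes JAVA_HOME to the YARN Application Master process in cluster mode.
# Setting it in spark-env.sh would NOT reach the AM in this mode.
# (Variable name and path are illustrative examples.)
spark.yarn.appMasterEnv.JAVA_HOME  /usr/lib/jvm/java-11
```

The same property can also be passed per-job with `--conf` on
`spark-submit` instead of editing `spark-defaults.conf`.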
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]