itholic commented on a change in pull request #34903:
URL: https://github.com/apache/spark/pull/34903#discussion_r769236345
##########
File path: docs/configuration.md
##########
@@ -3029,7 +3029,10 @@ can be found on the pages for each mode:
 Certain Spark settings can be configured through environment variables, which are read from the
 `conf/spark-env.sh` script in the directory where Spark is installed (or `conf/spark-env.cmd` on
 Windows). In Standalone and Mesos modes, this file can give machine specific information such as
-hostnames. It is also sourced when running local Spark applications or submission scripts.
+hostnames. It is also sourced when running local Spark applications or submission scripts. For
+pyspark applications, the environment variable `_PYSPARK_DRIVER_SYS_EXECUTABLE` will be set to
+the python interpreter's `sys.executable`, which will allow further customization based on the
+user's virtual environment.
Review comment:
nit: Can we use the official naming for Python and PySpark?
- pyspark -> PySpark
- python -> Python
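
For context, the doc change above says PySpark exports the driver interpreter's `sys.executable` as `_PYSPARK_DRIVER_SYS_EXECUTABLE` so that `conf/spark-env.sh` can react to the user's virtual environment. A minimal sketch of what such a fragment might look like is below; the fallback path `/opt/venvs/etl/bin/python` is purely hypothetical and only simulates the value PySpark would set, and the choice to forward it into `PYSPARK_PYTHON` is one possible customization, not something mandated by the PR:

```shell
# Hypothetical conf/spark-env.sh fragment (a sketch, not from the PR).
# _PYSPARK_DRIVER_SYS_EXECUTABLE is described as being set by PySpark to the
# driver's sys.executable; the default below only simulates that for the demo.
_PYSPARK_DRIVER_SYS_EXECUTABLE="${_PYSPARK_DRIVER_SYS_EXECUTABLE:-/opt/venvs/etl/bin/python}"

if [ -n "${_PYSPARK_DRIVER_SYS_EXECUTABLE}" ]; then
  # One possible customization: run worker Python processes with the same
  # interpreter (and hence the same virtualenv) as the driver.
  export PYSPARK_PYTHON="${_PYSPARK_DRIVER_SYS_EXECUTABLE}"
fi

echo "PYSPARK_PYTHON=${PYSPARK_PYTHON}"
```

Since `spark-env.sh` is sourced for local applications and submission scripts (as the surrounding doc text notes), this keeps driver and executors on the same interpreter without hard-coding a path per machine.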
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]