ueshin commented on a change in pull request #30436:
URL: https://github.com/apache/spark/pull/30436#discussion_r527364178



##########
File path: python/docs/source/getting_started/install.rst
##########
@@ -68,8 +68,13 @@ It is recommended to use ``-v`` option in ``pip`` to track the installation and
 
     HADOOP_VERSION=2.7 pip install pyspark -v
 
-Supported versions of Hadoop are ``HADOOP_VERSION=2.7`` and ``HADOOP_VERSION=3.2`` (default).
-Note that this installation of PySpark with a different version of Hadoop is experimental. It can change or be removed between minor releases.
+Supported values in ``HADOOP_VERSION`` are:
+
+- ``without``: Spark pre-built with user-provided Apache Hadoop
+- ``2.7``: Spark pre-built for Apache Hadoop 2.7
+- ``3.2``: Spark pre-built for Apache Hadoop 3.2 and later (default)

Review comment:
       Awesome!
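      For anyone reading this thread later, the three values in the new list translate to invocations like the following. This is just a sketch mirroring the ``HADOOP_VERSION=2.7 pip install pyspark -v`` line already shown in the doc; it is not run here.

```shell
# Sketch: the three supported HADOOP_VERSION values enumerated above.

# Spark pre-built with user-provided Apache Hadoop:
HADOOP_VERSION=without pip install pyspark -v

# Spark pre-built for Apache Hadoop 2.7:
HADOOP_VERSION=2.7 pip install pyspark -v

# Spark pre-built for Apache Hadoop 3.2 and later (the default):
HADOOP_VERSION=3.2 pip install pyspark -v
```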




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


