HyukjinKwon commented on a change in pull request #34315:
URL: https://github.com/apache/spark/pull/34315#discussion_r731470851



##########
File path: python/docs/source/getting_started/install.rst
##########
@@ -83,46 +83,54 @@ Note that this installation way of PySpark with/without a specific Hadoop versio
 Using Conda
 -----------
 
-Conda is an open-source package management and environment management system which is a part of
-the `Anaconda <https://docs.continuum.io/anaconda/>`_ distribution. It is both cross-platform and
-language agnostic. In practice, Conda can replace both `pip <https://pip.pypa.io/en/latest/>`_ and
-`virtualenv <https://virtualenv.pypa.io/en/latest/>`_.
-
-Create new virtual environment from your terminal as shown below:
-
-.. code-block:: bash
-
-    conda create -n pyspark_env
-
-After the virtual environment is created, it should be visible under the list of Conda environments
-which can be seen using the following command:
+Conda is an open-source package management and environment management system (developed by
+`Anaconda <https://www.anaconda.com/>`_), which is best installed through
+`Miniconda <https://docs.conda.io/en/latest/miniconda.html>`_ or `Miniforge <https://github.com/conda-forge/miniforge/>`_.
+The tool is both cross-platform and language agnostic, and in practice, conda can replace both
+`pip <https://pip.pypa.io/en/latest/>`_ and `virtualenv <https://virtualenv.pypa.io/en/latest/>`_.
+
+Conda uses so-called channels to distribute packages. Besides the default channels maintained by
+Anaconda itself, the most important channel is `conda-forge <https://conda-forge.org/>`_, the
+community-driven packaging effort that is the most extensive and most up-to-date (and which also
+serves as the upstream for the Anaconda channels in most cases).
+
+Generally, it is recommended to use *as few channels as possible*. Conda-forge and Anaconda put a
+lot of effort into guaranteeing binary compatibility between packages (e.g. by using compatible
+compilers for all packages and tracking which packages are ABI-relevant). Needlessly mixing in
+other channels can break those guarantees, which is why conda-forge even recommends
+so-called "strict channel priority":
 
 .. code-block:: bash
 
-    conda env list
+    conda config --add channels conda-forge
+    conda config --set channel_priority strict
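+
+To verify the resulting channel configuration, you can for example run:
+
+.. code-block:: bash
+
+    conda config --show channels
+    conda config --show channel_priority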
 
-Now activate the newly created environment with the following command:
+To create a new conda environment from your terminal and activate it, proceed as shown below:
 
 .. code-block:: bash
 
+    conda create -n pyspark_env
     conda activate pyspark_env
 
-You can install pyspark by `Using PyPI <#using-pypi>`_ to install PySpark in the newly created
-environment, for example as below. It will install PySpark under the new virtual environment
-``pyspark_env`` created above.
+After activating the environment, use the following command to install PySpark,
+a Python version of your choice, as well as other packages you want to use in
+the same session as PySpark (you can also install everything in several steps).
 
 .. code-block:: bash
 
-    pip install pyspark
+    conda install -c conda-forge pyspark python [other packages]  # can also use python=3.8, etc.
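+
+To confirm that the installation worked, a quick smoke test is to import PySpark from within the
+activated environment, for example:
+
+.. code-block:: bash
+
+    python -c "import pyspark; print(pyspark.__version__)"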

Review comment:
       That's fine. Let's encourage using Conda here, which I agree with. The problems are:
   - When a release is out of sync and only available on pip, falling back to pip would be inevitable. This is my major concern.
   - Here we should focus on the installation steps rather than on explaining the relationship between Conda and pip.
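
   For example, when a release is only on PyPI, a minimal fallback sketch (reusing the `pyspark_env` environment from the docs above) would be:

   ```bash
   conda activate pyspark_env
   pip install pyspark  # fall back to PyPI when the release is not yet on conda-forge
   ```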
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


