This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8084318f25b [SPARK-41074][DOC] Add option `--upgrade` in dependency installation command
8084318f25b is described below

commit 8084318f25bca0f66de404ea8c258279f1012974
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Wed Nov 9 20:06:02 2022 +0900

    [SPARK-41074][DOC] Add option `--upgrade` in dependency installation command
    
    ### What changes were proposed in this pull request?
    Add the `--upgrade` option to the dependency installation commands in docs/README.md and python/docs/source/development/contributing.rst.
    
    ### Why are the changes needed?
    
    For packages whose versions are not pinned, `pip install -r dev/requirements.txt` cannot upgrade them.
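
    A minimal sketch of the difference, assuming an unpinned entry such as `numpy` in dev/requirements.txt:

        # With some version of numpy already installed, a plain install skips it:
        $ pip install -r dev/requirements.txt
        # With --upgrade, pip re-resolves unpinned packages to their latest versions:
        $ pip install --upgrade -r dev/requirements.txt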
    
    ### Does this PR introduce _any_ user-facing change?
    No
    
    ### How was this patch tested?
    Manually checked; see the sketch below.
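
    A possible manual check, assuming a stale unpinned package (e.g. an old `numpy`) in the environment:

        $ pip install --upgrade -r dev/requirements.txt
        # The previously stale entries from the file should no longer be listed:
        $ pip list --outdated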
    
    Closes #38581 from zhengruifeng/infra_pip.
    
    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 docs/README.md                                  | 2 +-
 python/docs/source/development/contributing.rst | 4 ++--
 2 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/README.md b/docs/README.md
index 27238964f0a..4b788dbc79d 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -61,7 +61,7 @@ See also https://issues.apache.org/jira/browse/SPARK-35375.
 -->
 Run the following command from $SPARK_HOME:
 ```sh
-$ sudo pip install -r dev/requirements.txt
+$ sudo pip install --upgrade -r dev/requirements.txt
 ```
 
 ### R API Documentation (Optional)
diff --git a/python/docs/source/development/contributing.rst b/python/docs/source/development/contributing.rst
index 3d388e91012..88f7b3a7b43 100644
--- a/python/docs/source/development/contributing.rst
+++ b/python/docs/source/development/contributing.rst
@@ -130,7 +130,7 @@ If you are using Conda, the development environment can be set as follows.
     # Python 3.7+ is required
     conda create --name pyspark-dev-env python=3.9
     conda activate pyspark-dev-env
-    pip install -r dev/requirements.txt
+    pip install --upgrade -r dev/requirements.txt
 
 Once it is set up, make sure you switch to `pyspark-dev-env` before starting the development:
 
@@ -147,7 +147,7 @@ With Python 3.7+, pip can be used as below to install and set up the development
 
 .. code-block:: bash
 
-    pip install -r dev/requirements.txt
+    pip install --upgrade -r dev/requirements.txt
 
 Now, you can start developing and `running the tests <testing.rst>`_.
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
