This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-2.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-2.4 by this push:
     new 223b9fb  [SPARK-31231][BUILD] Explicitly set setuptools version as 46.0.0 in pip package test
223b9fb is described below

commit 223b9fb1eadeba0e05b1a300512c31c4f99f41e8
Author: HyukjinKwon <gurwls...@apache.org>
AuthorDate: Tue Mar 24 17:59:43 2020 +0900

    [SPARK-31231][BUILD] Explicitly set setuptools version as 46.0.0 in pip package test
    
    ### What changes were proposed in this pull request?
    
    For a bit of background, the PIP packaging test started to fail (see [these logs](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/120218/testReport/)) as of the setuptools 46.1.0 release. In https://github.com/pypa/setuptools/issues/1424, they decided not to keep the file modes in `package_data`.
    
    In the PySpark pip installation, we keep the executable scripts in `package_data` (https://github.com/apache/spark/blob/fc4e56a54c15e20baf085e6061d3d83f5ce1185d/python/setup.py#L199-L200) and expose their symbolic links as executable scripts.
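
    For illustration, a minimal `setup.py` along these lines (a simplified sketch with placeholder names, not Spark's actual `python/setup.py`) ships scripts via `package_data`; it is these copied files whose execute bits setuptools 46.1.0+ no longer preserves:

    ```python
    # Simplified sketch only; package and file names are placeholders, not Spark's.
    from setuptools import setup

    setup(
        name="example-pkg",
        version="0.1",
        packages=["example_pkg", "example_pkg.bin"],
        # Ship launcher scripts as package data; setuptools >= 46.1.0 no longer
        # preserves their file modes (execute bits) when copying them at install time.
        package_data={"example_pkg.bin": ["*"]},
    )
    ```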
    
    So, the symbolic links (or copied scripts) execute the scripts copied from `package_data`, which do not have the executable permission in their mode:
    
    ```
    /tmp/tmp.UmkEGNFdKF/3.6/bin/spark-submit: line 27: /tmp/tmp.UmkEGNFdKF/3.6/lib/python3.6/site-packages/pyspark/bin/spark-class: Permission denied
    /tmp/tmp.UmkEGNFdKF/3.6/bin/spark-submit: line 27: exec: /tmp/tmp.UmkEGNFdKF/3.6/lib/python3.6/site-packages/pyspark/bin/spark-class: cannot execute: Permission denied
    ```
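
    As a sanity check (a sketch; the path is the temporary virtualenv from the log above and would differ per run), the missing execute bit can be confirmed directly:

    ```python
    # Confirm the symptom: the installed launcher script lost its execute bit.
    import os
    import stat

    # Placeholder path; point this at the site-packages of the virtualenv under test.
    script = "/tmp/tmp.UmkEGNFdKF/3.6/lib/python3.6/site-packages/pyspark/bin/spark-class"
    mode = os.stat(script).st_mode
    print(oct(stat.S_IMODE(mode)))    # e.g. 0o644 instead of the expected 0o755
    print(bool(mode & stat.S_IXUSR))  # False when the execute bit was dropped
    ```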
    
    The current issue is being tracked at https://github.com/pypa/setuptools/issues/2041
    
    
    For what this PR proposes:
    It sets the upper bound of the setuptools version in the PR builder for now to unblock other PRs. _This PR does not solve the issue yet. I will make a fix after monitoring https://github.com/pypa/setuptools/issues/2041_
    
    ### Why are the changes needed?
    
    It currently affects users who use the latest setuptools, so _users seem unable to use PySpark with the latest setuptools._ See also https://github.com/pypa/setuptools/issues/2041#issuecomment-602566667
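
    As a possible user-side workaround (an assumption, not part of this patch), pinning setuptools to a version below 46.1.0 before installing PySpark should avoid the dropped execute bits, e.g.:

    ```python
    # Sketch of a user-side workaround: pin setuptools before installing PySpark,
    # mirroring what this patch does for the CI conda environment.
    import subprocess
    import sys

    subprocess.check_call([sys.executable, "-m", "pip", "install", "setuptools==46.0.0"])
    subprocess.check_call([sys.executable, "-m", "pip", "install", "pyspark"])
    ```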
    
    ### Does this PR introduce any user-facing change?
    
    It makes CI pass for now. No user-facing change yet.
    
    ### How was this patch tested?
    
    Jenkins will test.
    
    Closes #27995 from HyukjinKwon/investigate-pip-packaging.
    
    Authored-by: HyukjinKwon <gurwls...@apache.org>
    Signed-off-by: HyukjinKwon <gurwls...@apache.org>
    (cherry picked from commit c181c45f863ba55e15ab9b41f635ffbddad9bac0)
    Signed-off-by: HyukjinKwon <gurwls...@apache.org>
---
 dev/run-pip-tests | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/dev/run-pip-tests b/dev/run-pip-tests
index 60cf4d8..f9cd94d 100755
--- a/dev/run-pip-tests
+++ b/dev/run-pip-tests
@@ -81,7 +81,7 @@ for python in "${PYTHON_EXECS[@]}"; do
     VIRTUALENV_PATH="$VIRTUALENV_BASE"/$python
     rm -rf "$VIRTUALENV_PATH"
     if [ -n "$USE_CONDA" ]; then
-      conda create -y -p "$VIRTUALENV_PATH" python=$python numpy pandas pip setuptools
+      conda create -y -p "$VIRTUALENV_PATH" python=$python numpy pandas pip setuptools=46.0.0
       source activate "$VIRTUALENV_PATH"
     else
       mkdir -p "$VIRTUALENV_PATH"


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
