[ https://issues.apache.org/jira/browse/SPARK-32419?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon reassigned SPARK-32419:
------------------------------------

    Assignee: Hyukjin Kwon

> Leverage Conda environment at pip packaging test in GitHub Actions
> ------------------------------------------------------------------
>
>                 Key: SPARK-32419
>                 URL: https://issues.apache.org/jira/browse/SPARK-32419
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build, PySpark
>    Affects Versions: 3.1.0
>            Reporter: Hyukjin Kwon
>            Assignee: Hyukjin Kwon
>            Priority: Major
>
> If you take a close look at the GitHub Actions log:
> {code:java}
>  Installing dist into virtual env
> Processing ./python/dist/pyspark-3.1.0.dev0.tar.gz
> Collecting py4j==0.10.9
>  Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
> Using legacy setup.py install for pyspark, since package 'wheel' is not installed.
> Installing collected packages: py4j, pyspark
>  Running setup.py install for pyspark: started
>  Running setup.py install for pyspark: finished with status 'done'
> Successfully installed py4j-0.10.9 pyspark-3.1.0.dev0
> ...
> Installing dist into virtual env
> Obtaining file:///home/runner/work/spark/spark/python
> Collecting py4j==0.10.9
>  Downloading py4j-0.10.9-py2.py3-none-any.whl (198 kB)
> Installing collected packages: py4j, pyspark
>  Attempting uninstall: py4j
>  Found existing installation: py4j 0.10.9
>  Uninstalling py4j-0.10.9:
>  Successfully uninstalled py4j-0.10.9
>  Attempting uninstall: pyspark
>  Found existing installation: pyspark 3.1.0.dev0
>  Uninstalling pyspark-3.1.0.dev0:
>  Successfully uninstalled pyspark-3.1.0.dev0
>  Running setup.py develop for pyspark
> Successfully installed py4j-0.10.9 pyspark
> {code}
> It looks like Conda is not being used properly: the second run uninstalls the previously installed py4j and pyspark and re-installs them into the same environment, instead of starting from a fresh Conda environment each time.
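> A minimal sketch of the kind of per-test Conda flow this could use (illustrative only: the environment path, Python version, and exact commands below are assumptions, not the actual dev/run-pip-tests logic):
> {code:bash}
> # Create a fresh, throwaway Conda environment for each packaging test so pip
> # never has to uninstall a previous install. Assumes conda is already on the runner.
> set -ex
>
> # Make `conda activate` usable in a non-interactive shell.
> source "$(conda info --base)/etc/profile.d/conda.sh"
>
> ENV_DIR="$(mktemp -d)/pyspark-pip-test"
> conda create -y -p "$ENV_DIR" python=3.8 pip setuptools wheel
> conda activate "$ENV_DIR"
>
> # Install the freshly built source distribution into the clean environment.
> pip install python/dist/pyspark-*.tar.gz
> python -c "import pyspark; print(pyspark.__version__)"
>
> # Tear the environment down so the next run starts from scratch.
> conda deactivate
> conda env remove -y -p "$ENV_DIR"
> {code}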


