[ https://issues.apache.org/jira/browse/SPARK-23965?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16434809#comment-16434809 ]

Hyukjin Kwon commented on SPARK-23965:
--------------------------------------

I think that sounds like we would be making the third-party library more 
dependent on Spark itself. 

Here is another simple solution I used a while ago:

{code}
# glob every zip under $SPARK_HOME/python/lib and join them with ':',
# so no py4j version is hardcoded anywhere
export PYTHONPATH=$(ZIPS=("$SPARK_HOME"/python/lib/*.zip); IFS=:; echo "${ZIPS[*]}"):$PYTHONPATH
{code}
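
The array glob picks up every zip under $SPARK_HOME/python/lib (py4j and 
pyspark alike) whatever version they carry, and IFS=: joins them with 
colons. With a hypothetical SPARK_HOME of /opt/spark (exact filenames vary 
by release) it expands like this:

{code}
$ ls "$SPARK_HOME"/python/lib    # hypothetical listing; versions differ per release
py4j-0.10.6-src.zip  pyspark.zip
$ (ZIPS=("$SPARK_HOME"/python/lib/*.zip); IFS=:; echo "${ZIPS[*]}")
/opt/spark/python/lib/py4j-0.10.6-src.zip:/opt/spark/python/lib/pyspark.zip
{code}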


> make python/py4j-src-0.x.y.zip file name Spark version-independent
> ------------------------------------------------------------------
>
>                 Key: SPARK-23965
>                 URL: https://issues.apache.org/jira/browse/SPARK-23965
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.2.1, 2.3.0, 2.4.0
>            Reporter: Ruslan Dautkhanov
>            Priority: Major
>
> After each Spark release (which normally packages a slightly newer 
> version of py4j), we have to adjust our PySpark applications' PYTHONPATH 
> to point to the correct version of python/py4j-src-0.x.y.zip: change 
> python/py4j-src-0.9.2.zip to python/py4j-src-0.9.6.zip, in the next 
> release to something else, etc. 
> Possible solutions: it would be great to either
>  - rename `python/py4j-src-0.x.y.zip` to `python/py4j-src-latest.zip` or 
> `python/py4j-src-current.zip`,
>  - or ship a symlink `py4j-src-current.zip` in the Spark distribution 
> pointing to whatever py4j version Spark ships with (see the sketch after 
> this quoted description).
> In either case, once this is solved, we wouldn't have to adjust 
> PYTHONPATH during upgrades like Spark 2.2 to 2.3. 
> Thanks.
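
For illustration, a minimal sketch of the symlink option above, assuming 
the zips live under $SPARK_HOME/python/lib and matching whatever 
py4j-*.zip name the release ships; the name py4j-src-current.zip comes 
from the proposal and is not part of any Spark release:

{code}
# pick up whichever py4j source zip this Spark release ships
PY4J_ZIPS=("$SPARK_HOME"/python/lib/py4j-*.zip)

# expose it under a version-independent name; Python's zipimport follows
# symlinks, so PYTHONPATH never has to change across Spark upgrades
ln -sfn "${PY4J_ZIPS[0]}" "$SPARK_HOME/python/lib/py4j-src-current.zip"
export PYTHONPATH="$SPARK_HOME/python/lib/py4j-src-current.zip:$SPARK_HOME/python/lib/pyspark.zip:$PYTHONPATH"
{code}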


