[ https://issues.apache.org/jira/browse/SPARK-26831?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26831:
------------------------------------

    Assignee: Apache Spark

> bin/pyspark: avoid hardcoded `python` command and improve version checks
> ------------------------------------------------------------------------
>
>                 Key: SPARK-26831
>                 URL: https://issues.apache.org/jira/browse/SPARK-26831
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Stefaan Lippens
>            Assignee: Apache Spark
>            Priority: Major
>
> (This originally started at https://github.com/apache/spark/pull/23736.)
> I was trying out pyspark on a system with only a {{python3}} command and no {{python}} command, and got this error:
> {code}
> /opt/spark/bin/pyspark: line 45: python: command not found
> {code}
> While the pyspark script is full of variables referring to a Python interpreter, a hardcoded {{python}} is still used for:
> {code}
> WORKS_WITH_IPYTHON=$(python -c 'import sys; print(sys.version_info >= (2, 7, 0))')
> {code}
> While looking into this, I also noticed that the bash syntax of the IPython version check is wrong:
> {code}
> if [[ ! $WORKS_WITH_IPYTHON ]]
> {code}
> This always evaluates to false whenever {{$WORKS_WITH_IPYTHON}} is non-empty, which it is in both cases ("True" and "False"), so the check never fires.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
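The two problems described above can be sketched in a short bash fragment. This is only an illustration of the reported issue and a plausible fix, not the actual patch from the linked PR; the interpreter-resolution order shown here is an assumption.

```shell
#!/usr/bin/env bash
# Sketch only: resolve a Python interpreter instead of hardcoding `python`.
# Prefer Spark's own env vars, then python3, then python (assumed order).
if [[ -n "$PYSPARK_DRIVER_PYTHON" ]]; then
  PYSPARK_PYTHON_CMD="$PYSPARK_DRIVER_PYTHON"
elif [[ -n "$PYSPARK_PYTHON" ]]; then
  PYSPARK_PYTHON_CMD="$PYSPARK_PYTHON"
elif command -v python3 > /dev/null 2>&1; then
  PYSPARK_PYTHON_CMD=python3
else
  PYSPARK_PYTHON_CMD=python
fi

# Run the version check with the resolved interpreter, not a bare `python`.
# Python prints the literal string "True" or "False".
WORKS_WITH_IPYTHON=$("$PYSPARK_PYTHON_CMD" -c 'import sys; print(sys.version_info >= (2, 7, 0))')

# Buggy form:  [[ ! $WORKS_WITH_IPYTHON ]]
#   In bash, `[[ $var ]]` tests for a non-empty string, so the negation is
#   false for ANY non-empty value -- including the string "False".
# Fixed form: compare against the exact string Python printed.
if [[ "$WORKS_WITH_IPYTHON" != "True" ]]; then
  echo "IPython requires Python >= 2.7; set PYSPARK_PYTHON to a suitable interpreter" 1>&2
  exit 1
fi
```

Comparing against the literal `"True"` also fails closed: if the interpreter call itself errors and the variable ends up empty, the check still rejects.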