The Python installed in your cluster is 2.5. You need at least 2.6.
Eric Friedman
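(A quick way to confirm Eric's diagnosis: run a one-liner on a cluster node to see which interpreter the workers would get. This is a sketch, not part of the original thread; the 2.6 floor is the requirement stated above.)

```python
import sys

# PySpark (at the time of this thread) required Python 2.6 or newer.
# Running this on each cluster node shows what interpreter is in use.
print(sys.version_info[:2])
assert sys.version_info >= (2, 6), "PySpark needs Python >= 2.6"
```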
On Dec 30, 2014, at 7:45 AM, Jaggu jagana...@gmail.com wrote:
Hi Team,
I was trying to execute a PySpark job on the cluster. It gives me the following
error. (When I run the same job in local mode it is …
Hi
I am using Anaconda Python. Is there any way to specify the Python which we
have to use for running PySpark on a cluster?
Best regards
Jagan
On Tue, Dec 30, 2014 at 6:27 PM, Eric Friedman eric.d.fried...@gmail.com
wrote:
The Python installed in your cluster is 2.5. You need at least 2.6.
To configure the Python executable used by PySpark, see the "Using the
Shell" section (Python tab) of the Spark Programming Guide:
https://spark.apache.org/docs/latest/programming-guide.html#using-the-shell
You can set the PYSPARK_PYTHON environment variable to choose the Python
executable that will be used.
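(A minimal sketch of the advice above. The Anaconda path is an example only; point it at wherever Anaconda is actually installed, and note it must exist on every worker node. PYSPARK_PYTHON must be set before the SparkContext is created, e.g. exported in the shell before running pyspark or spark-submit.)

```python
import os

# Tell Spark which interpreter to launch worker processes with.
# The path below is hypothetical; adjust it to your cluster's
# Anaconda installation, present on every node.
os.environ["PYSPARK_PYTHON"] = "/opt/anaconda/bin/python"

# Equivalent shell form, run before pyspark / spark-submit:
#   export PYSPARK_PYTHON=/opt/anaconda/bin/python
```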