We're running Python 2.6.6 here, but we're looking to upgrade to 2.7.x in a
month.

Does PySpark work by converting Python into Java bytecode, or does it run
Python natively?

And along those lines, if we're running in yarn-client mode, would we have
to upgrade Python just on the edge node, or on every node in the cluster?



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Does-Python-2-7-have-to-be-installed-on-every-cluster-node-tp22945.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
