[ https://issues.apache.org/jira/browse/SPARK-15238?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15277468#comment-15277468 ]
Apache Spark commented on SPARK-15238:
--------------------------------------

User 'nchammas' has created a pull request for this issue:
https://github.com/apache/spark/pull/13017

> Clarify Python 3 support in docs
> --------------------------------
>
>                 Key: SPARK-15238
>                 URL: https://issues.apache.org/jira/browse/SPARK-15238
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, PySpark
>            Reporter: Nicholas Chammas
>            Priority: Trivial
>
> The [current doc|http://spark.apache.org/docs/1.6.1/] reads:
> {quote}
> Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.1
> uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
> {quote}
> Projects that support Python 3 generally mention that explicitly. A casual
> Python user might assume from this line that Spark supports Python 2.6 and
> 2.7 but not 3+.
> More specifically, I gather from SPARK-4897 that Spark actually supports
> Python 3.4+ and not earlier versions of Python 3.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org