[
https://issues.apache.org/jira/browse/BIGTOP-1181?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13871345#comment-13871345
]
Sean Mackrory commented on BIGTOP-1181:
---------------------------------------
Good point. A quick attempt to run my tests on RHEL 5 didn't show any immediate
incompatibilities with Python 2.4, but that server is screwed up in other ways
and I didn't get the whole suite to complete - so I should do more testing
there. Spark's docs state that "PySpark requires Python 2.6 or higher"
(http://spark.incubator.apache.org/docs/0.8.0/python-programming-guide.html),
so even if it works, it isn't officially supported. I'll add Python
dependencies to the packages in my patch. Given the lack of Python 2.4 support,
I think we should break pyspark out into a separate package (so it's not
required for every Spark install) and require "python26" on EL5 (available from
EPEL). It's not ideal, but it's probably the best we can do if we want to at
least provide the option of pyspark.
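For illustration only, the subpackage could look roughly like the sketch below
in the spark spec file. The subpackage name (spark-python), the base package
name (spark-core), and the exact %{?rhel} guard are my assumptions here, not
what the attached patch actually does:

    %package -n spark-python
    Summary: Python client for Spark (PySpark)
    Requires: spark-core = %{version}-%{release}
    %if 0%{?rhel} && 0%{?rhel} <= 5
    # EL5 ships Python 2.4; pull the parallel-installable interpreter from EPEL
    Requires: python26
    %else
    Requires: python >= 2.6
    %endif

    %description -n spark-python
    Includes PySpark, the Python API for Spark.

That way a plain "yum install spark-core" never drags in EPEL, and only users
who explicitly want pyspark take on the python26 dependency on EL5.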
> Add pyspark to spark package
> ----------------------------
>
> Key: BIGTOP-1181
> URL: https://issues.apache.org/jira/browse/BIGTOP-1181
> Project: Bigtop
> Issue Type: Bug
> Reporter: Sean Mackrory
> Assignee: Sean Mackrory
> Attachments: 0001-BIGTOP-1181.-Add-pyspark-to-spark-package.patch
>
>
--
This message was sent by Atlassian JIRA
(v6.1.5#6160)