[
https://issues.apache.org/jira/browse/SPARK-7909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14563352#comment-14563352
]
Davies Liu commented on SPARK-7909:
-----------------------------------
[~meawoppl] It's true that some tools don't work with Python 3, but PySpark 1.4
should work with Python 3 (it is tested against Python 3 for every PR); you can
verify this with the 1.4-RC2 package.
The logs don't point to anything wrong with Python; they show that the executors
keep dying. Could you check the executor logs?
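For example, a minimal smoke test along these lines (just a sketch, assuming
PYSPARK_PYTHON=python3 is exported and the driver is launched with the 1.4-RC2
bin/spark-submit; the app name and data sizes are arbitrary) should complete
cleanly if the executors are healthy:
{code}
# Minimal Python 3 smoke test for PySpark 1.4 (a sketch, not an official test).
# Run with e.g.:  PYSPARK_PYTHON=python3 bin/spark-submit py3_smoke_test.py
import sys

from pyspark import SparkContext

sc = SparkContext(appName="py3-smoke-test")
print(sys.version)                      # confirm the driver is running Python 3
rdd = sc.parallelize(range(1000), 4)
print(rdd.map(lambda x: x * x).sum())   # forces executors to run Python tasks
sc.stop()
{code}
If this fails, the executor logs should show why the Python workers on the
cluster side are dying.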
> spark-ec2 and associated tools not py3 ready
> --------------------------------------------
>
> Key: SPARK-7909
> URL: https://issues.apache.org/jira/browse/SPARK-7909
> Project: Spark
> Issue Type: Improvement
> Components: EC2
> Environment: ec2 python3
> Reporter: Matthew Goodman
>
> At present there is no permutation of tools that supports Python 3 on both
> the launching computer and the running cluster. There are a couple of
> problems involved:
> - There is no prebuilt Spark binary with Python 3 support.
> - spark-ec2/spark/init.sh contains inline print statements that are not
> Python 3 compatible.
> - Config files for cluster processes don't seem to make it to all nodes in a
> working format.
> I have fixes for some of this, but debugging the config and running context
> remains elusive to me.