Could you rebuild the whole project? I changed the Python function
serialization format in https://github.com/apache/spark/pull/11535 to fix a
bug. This exception looks like somewhere is still using the old code.
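For reference, a full rebuild from the repository root is typically done with the bundled Maven wrapper; the exact flags here are illustrative, not a prescription:

```shell
# Clean and rebuild all of Spark from the repo root, skipping tests for speed.
# build/mvn is the Maven wrapper shipped inside the Spark source tree.
./build/mvn -DskipTests clean package
```

After the rebuild, PySpark picks up the freshly built jars automatically, so no separate Python-side build step should be needed.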
On Sun, Mar 6, 2016 at 6:24 PM, Hyukjin Kwon wrote:
Just in case, my Python version is 2.7.10.
2016-03-07 11:19 GMT+09:00 Hyukjin Kwon:
Hi all,
While testing some code in PySpark, I ran into a weird issue.
This works fine on Spark 1.6.0, but it does not seem to work on Spark 2.0.0.
When I simply run *logData = sc.textFile(path).coalesce(1) *with some big
files in standalone local mode without HDFS, it throws the
exception
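For concreteness, the failing call above can be reduced to a sketch like this; the path and app name are placeholders, and it assumes PySpark is installed and running in standalone local mode without HDFS:

```python
# Minimal repro sketch for the coalesce(1) failure described above.
from pyspark import SparkContext

sc = SparkContext("local[*]", "coalesce-repro")

# "/path/to/big/files" is a hypothetical placeholder for the large local input.
logData = sc.textFile("/path/to/big/files").coalesce(1)

# RDD transformations are lazy, so the exception only surfaces once an
# action such as count() forces evaluation.
print(logData.count())
sc.stop()
```

Note that coalesce(1) itself returns immediately; the serialization error would only appear when the job actually runs.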