[
https://issues.apache.org/jira/browse/SPARK-9621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17083639#comment-17083639
]
Guillaume Martres commented on SPARK-9621:
------------------------------------------
> unfixed as of scala 2.12.10 / 2.13.1
... but fixed in Dotty, so it's really not a great idea to rely on that behaviour.
> Closure inside RDD doesn't properly close over environment
> ----------------------------------------------------------
>
> Key: SPARK-9621
> URL: https://issues.apache.org/jira/browse/SPARK-9621
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.4.1
> Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
> Reporter: Joe Near
> Priority: Major
>
> I expect the following:
> case class MyTest(i: Int)
> val tv = MyTest(1)
> val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)
> to evaluate to true. Instead, it is false when I type this into spark-shell. It
> seems the closure is changed somehow when it is serialized and deserialized.