[
https://issues.apache.org/jira/browse/SPARK-9621?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17083608#comment-17083608
]
Alexandre Archambault commented on SPARK-9621:
----------------------------------------------
FYI, like in https://issues.apache.org/jira/browse/SPARK-2620, adding "final"
when defining the case class seems to fix that.
{{final case class MyTest(i: Int)}}
Beware that this workaround actually relies on a bug in Scala, unfixed as of
Scala 2.12.10 / 2.13.1. See the discussion
[here|https://issues.apache.org/jira/browse/SPARK-2620?focusedCommentId=17083606&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17083606].
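The serialize/deserialize round-trip the reporter suspects can be sketched outside Spark. The following is a hypothetical standalone program (not from the ticket) that ships the same closure through plain Java serialization, which is what Spark's closure serializer does by default. In compiled code the equality survives the round-trip, which is why the failure points at spark-shell's REPL class wrapping rather than at serialization itself:

```scala
import java.io._

// "final" is the workaround under discussion in this comment.
final case class MyTest(i: Int)

object ClosureRoundTrip {
  // Serialize and deserialize a value with plain Java serialization,
  // mimicking what Spark does when it ships a closure to executors.
  def roundTrip[A <: AnyRef](a: A): A = {
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(a)
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    in.readObject().asInstanceOf[A]
  }

  def main(args: Array[String]): Unit = {
    val tv = MyTest(1)
    // Scala function literals are serializable; this one closes over tv.
    val f: MyTest => Boolean = (t: MyTest) => t == tv
    val g = roundTrip(f)
    // In a compiled program this prints true; the reporter saw false in
    // spark-shell, where each REPL line is wrapped in a fresh synthetic class,
    // so the deserialized closure can compare against a different MyTest class.
    println(g(tv))
  }
}
```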
> Closure inside RDD doesn't properly close over environment
> ----------------------------------------------------------
>
> Key: SPARK-9621
> URL: https://issues.apache.org/jira/browse/SPARK-9621
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.4.1
> Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
> Reporter: Joe Near
> Priority: Major
>
> I expect the following:
> {code}
> case class MyTest(i: Int)
> val tv = MyTest(1)
> val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)
> {code}
> to be "true". It is "false" when I type this into spark-shell. It seems the
> closure is changed somehow when it is serialized and deserialized.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)