[
https://issues.apache.org/jira/browse/SPARK-12061?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15940047#comment-15940047
]
Takeshi Yamamuro commented on SPARK-12061:
------------------------------------------
It seems this issue still happens in v2.1 and on master, so I'll update the
affected versions to include v2.1. Note that the physical plan below contains no
InMemoryTableScan, i.e. the cached data is not reused:
{code}
scala> val f = (i: Int) => i + 1
scala> val ds = Seq(1, 2, 3).toDS()
scala> val mapped = ds.map(f)
scala> mapped.cache()
scala> val mapped2 = ds.map(f)
scala> mapped2.explain
== Physical Plan ==
*SerializeFromObject [input[0, int, false] AS value#16]
+- *MapElements <function1>, obj#15: int
+- *DeserializeToObject value#1: int, obj#14: int
+- LocalTableScan [value#1]
{code}
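For comparison, a minimal workaround sketch (assuming the same spark-shell
session as above, with the implicits in scope): if you keep a reference to the
cached Dataset and reuse it directly, instead of re-deriving an equivalent
Dataset from {{ds}}, the cached plan is used. This suggests the miss comes from
plan matching across separately-built object-operator plans rather than from
the cache itself.
{code}
// Workaround sketch: reuse the cached Dataset reference directly.
val f = (i: Int) => i + 1
val ds = Seq(1, 2, 3).toDS()
val mapped = ds.map(f)
mapped.cache()
mapped.count()   // materialize the cache
mapped.explain() // this plan does read from the in-memory relation
{code}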
> Persist for Map/filter with Lambda Functions don't always read from Cache
> -------------------------------------------------------------------------
>
> Key: SPARK-12061
> URL: https://issues.apache.org/jira/browse/SPARK-12061
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Xiao Li
>
> So far, the existing caching mechanism does not work for Dataset operations
> that use map/filter with lambda functions. For example,
> {code}
> test("persist and then map/filter with lambda functions") {
>   val f = (i: Int) => i + 1
>   val ds = Seq(1, 2, 3).toDS()
>   val mapped = ds.map(f)
>   mapped.cache()
>   val mapped2 = ds.map(f)
>   assertCached(mapped2)
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)