Dilip Biswal created SPARK-19993:
------------------------------------
Summary: Caching logical plans containing subquery expressions does not work.
Key: SPARK-19993
URL: https://issues.apache.org/jira/browse/SPARK-19993
Project: Spark
Issue Type: Sub-task
Components: SQL
Affects Versions: 2.1.0
Reporter: Dilip Biswal
Here is a simple repro that illustrates the problem. The second invocation of the SQL should be answered from the cache; however, the cache lookup currently fails.
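The repro assumes two Parquet-backed tables, s1 and s2, each with a single int column c1 (inferred from the plan output below; the exact setup is not part of the original report). One possible way to create them in the shell:
{code}
// the default data source is Parquet, matching the Relation[...] parquet nodes below
scala> Seq(1, 2, 3).toDF("c1").write.saveAsTable("s1")
scala> Seq(1, 3).toDF("c1").write.saveAsTable("s2")
{code}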
{code}
scala> val ds = spark.sql("select * from s1 where s1.c1 in (select s2.c1 from s2 where s1.c1 = s2.c1)")
ds: org.apache.spark.sql.DataFrame = [c1: int]
scala> ds.cache
res13: ds.type = [c1: int]
scala> spark.sql("select * from s1 where s1.c1 in (select s2.c1 from s2 where s1.c1 = s2.c1)").explain(true)
== Analyzed Logical Plan ==
c1: int
Project [c1#86]
+- Filter c1#86 IN (list#78 [c1#86])
   :  +- Project [c1#87]
   :     +- Filter (outer(c1#86) = c1#87)
   :        +- SubqueryAlias s2
   :           +- Relation[c1#87] parquet
   +- SubqueryAlias s1
      +- Relation[c1#86] parquet
== Optimized Logical Plan ==
Join LeftSemi, ((c1#86 = c1#87) && (c1#86 = c1#87))
:- Relation[c1#86] parquet
+- Relation[c1#87] parquet
{code}
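A successful lookup would replace the LeftSemi join above with an InMemoryRelation over the cached data. One way to check for that programmatically, via Dataset.queryExecution (a sketch, not part of the original repro):
{code}
scala> val ds2 = spark.sql("select * from s1 where s1.c1 in (select s2.c1 from s2 where s1.c1 = s2.c1)")
// currently false because the lookup misses; it should be true once the cached plan is matched
scala> ds2.queryExecution.optimizedPlan.toString.contains("InMemoryRelation")
{code}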