[ https://issues.apache.org/jira/browse/SPARK-20908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiao Li updated SPARK-20908:
----------------------------
    Description: 
In the Cache Manager, plan matching should ignore hint nodes.

{noformat}
val df1 = spark.range(10).join(broadcast(spark.range(10)))
df1.cache()
spark.range(10).join(spark.range(10)).explain()
{noformat}

The output plan of the above query shows that the second query does not use the cached data of the first query.

{noformat}
BroadcastNestedLoopJoin BuildRight, Inner
:- *Range (0, 10, step=1, splits=2)
+- BroadcastExchange IdentityBroadcastMode
   +- *Range (0, 10, step=1, splits=2)
{noformat}

  was:
In the Cache Manager, plan matching should ignore hint nodes.

{noformat}
val df1 = spark.range(10).join(broadcast(spark.range(10)))
df1.cache()
spark.range(10).join(spark.range(10)).explain()
{noformat}

The plan above shows that the query does not use the cached data.

{noformat}
BroadcastNestedLoopJoin BuildRight, Inner
:- *Range (0, 10, step=1, splits=2)
+- BroadcastExchange IdentityBroadcastMode
   +- *Range (0, 10, step=1, splits=2)
{noformat}


> Cache Manager: Hint should be ignored in plan matching
> ------------------------------------------------------
>
>                 Key: SPARK-20908
>                 URL: https://issues.apache.org/jira/browse/SPARK-20908
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.1.1, 2.2.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>
> In the Cache Manager, plan matching should ignore hint nodes.
> {noformat}
> val df1 = spark.range(10).join(broadcast(spark.range(10)))
> df1.cache()
> spark.range(10).join(spark.range(10)).explain()
> {noformat}
> The output plan of the above query shows that the second query does not use
> the cached data of the first query.
> {noformat}
> BroadcastNestedLoopJoin BuildRight, Inner
> :- *Range (0, 10, step=1, splits=2)
> +- BroadcastExchange IdentityBroadcastMode
>    +- *Range (0, 10, step=1, splits=2)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
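The fix this issue asks for amounts to making the cache manager's plan comparison insensitive to hint nodes: a cached plan built with a broadcast hint and an unhinted query over the same relations should still match. The sketch below illustrates that idea in plain Scala. It does not use Spark's real classes; `Plan`, `Range`, `Join`, `Hint`, `stripHints`, and `sameResult` are made-up stand-ins for Spark's `LogicalPlan`, hint nodes, and plan-equality machinery, shown only to demonstrate the hint-stripping comparison.

```scala
// Toy plan representation (hypothetical; not Spark's LogicalPlan).
sealed trait Plan
case class Range(n: Int) extends Plan
case class Join(left: Plan, right: Plan) extends Plan
case class Hint(name: String, child: Plan) extends Plan // e.g. "broadcast"

// Recursively remove hint nodes so they cannot affect matching.
def stripHints(p: Plan): Plan = p match {
  case Hint(_, child) => stripHints(child)
  case Join(l, r)     => Join(stripHints(l), stripHints(r))
  case other          => other
}

// Hint-insensitive plan equality, analogous to the matching the issue requests.
def sameResult(a: Plan, b: Plan): Boolean = stripHints(a) == stripHints(b)

// Mirrors the repro: df1's plan carries a broadcast hint, the second query's does not.
val cached = Join(Range(10), Hint("broadcast", Range(10)))
val query  = Join(Range(10), Range(10))
```

With this comparison, `sameResult(cached, query)` holds, so the second query would be answered from the cached data instead of being replanned as a fresh join.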