[ https://issues.apache.org/jira/browse/SPARK-25037?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16571047#comment-16571047 ]
Hyukjin Kwon commented on SPARK-25037:
--------------------------------------

If it's an actual issue after the discussion in the mailing list, let's reopen this JIRA.

> plan.transformAllExpressions() doesn't transform expressions in nested SubqueryExpression plans
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25037
>                 URL: https://issues.apache.org/jira/browse/SPARK-25037
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Chris O'Hara
>            Priority: Minor
>
> Given the following LogicalPlan:
> {code:java}
> scala> val plan = spark.sql("SELECT 1 bar FROM (SELECT 1 foo) WHERE foo IN (SELECT 1 foo)").queryExecution.logical
> plan: org.apache.spark.sql.catalyst.plans.logical.LogicalPlan =
> 'Project [1 AS bar#29]
> +- 'Filter 'foo IN (list#31 [])
>    :  +- Project [1 AS foo#30]
>    :     +- OneRowRelation
>    +- SubqueryAlias __auto_generated_subquery_name
>       +- Project [1 AS foo#28]
>          +- OneRowRelation
> {code}
> the following transformation should replace all instances of lit(1) with lit(2):
> {code:java}
> scala> plan.transformAllExpressions { case l @ Literal(1, _) => l.copy(value = 2) }
> res0: plan.type =
> 'Project [2 AS bar#29]
> +- 'Filter 'foo IN (list#31 [])
>    :  +- Project [1 AS foo#30]
>    :     +- OneRowRelation
>    +- SubqueryAlias __auto_generated_subquery_name
>       +- Project [2 AS foo#28]
>          +- OneRowRelation
> {code}
> Instead, the plan nested inside the SubqueryExpression (the IN-list subquery) is left untransformed.
> The expected output is:
> {code:java}
> 'Project [2 AS bar#29]
> +- 'Filter 'foo IN (list#31 [])
>    :  +- Project [2 AS foo#30]
>    :     +- OneRowRelation
>    +- SubqueryAlias __auto_generated_subquery_name
>       +- Project [2 AS foo#28]
>          +- OneRowRelation
> {code}

--
This message was sent by Atlassian JIRA (v7.6.3#76005)
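For readers hitting the same limitation, a possible workaround is to recurse into subquery plans explicitly. The sketch below is untested and assumes the Spark 2.3-era catalyst API, where `SubqueryExpression` exposes a `plan` field and a `withNewPlan` method; the helper name `deepTransformAllExpressions` is hypothetical, not a Spark API:

{code:java}
import org.apache.spark.sql.catalyst.expressions.{Expression, SubqueryExpression}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

// Hypothetical helper: apply `rule` everywhere transformAllExpressions would,
// then also descend into each SubqueryExpression's nested LogicalPlan.
def deepTransformAllExpressions(plan: LogicalPlan)(
    rule: PartialFunction[Expression, Expression]): LogicalPlan =
  plan.transformAllExpressions(rule).transformAllExpressions {
    // Rebuild the subquery expression around its recursively transformed plan
    case s: SubqueryExpression =>
      s.withNewPlan(deepTransformAllExpressions(s.plan)(rule))
  }
{code}

With this, `deepTransformAllExpressions(plan) { case l @ Literal(1, _) => l.copy(value = 2) }` should rewrite the literal inside the IN-list subquery as well, producing the expected output above.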