[ https://issues.apache.org/jira/browse/SPARK-44714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17751966#comment-17751966 ]
ASF GitHub Bot commented on SPARK-44714:
----------------------------------------

User 'anchovYu' has created a pull request for this issue:
https://github.com/apache/spark/pull/42276

> Ease restriction of LCA resolution regarding queries with having
> ----------------------------------------------------------------
>
>                 Key: SPARK-44714
>                 URL: https://issues.apache.org/jira/browse/SPARK-44714
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.4.2, 3.5.0
>            Reporter: Xinyi Yu
>            Priority: Major
>
> The current lateral column alias (LCA) resolution has a limitation: it cannot resolve a query that satisfies all of the following criteria:
> # the main (outer) query has a HAVING clause
> # there is a window expression in the query
> # the same SELECT list as the window expression in 2) also contains an LCA
> This is because LCA resolution won't rewrite the plan until UNRESOLVED_HAVING is resolved; window expressions won't be extracted until the LCAs in the same SELECT list are rewritten; however, UNRESOLVED_HAVING depends on its child being resolved, which could include the Window. It becomes a deadlock.
> *We should ease some of the limitations on LCA resolution regarding HAVING, to break the deadlock for most cases.*
> For example, the following query:
> {code:java}
> create table t (col boolean) using orc;
> with w AS (
>   select min(col) over () as min_alias,
>          min_alias as col_alias
>   FROM t
> )
> select col_alias
> from w
> having count > 0;
> {code}
> currently throws a confusing error message:
> {code:java}
> [UNRESOLVED_COLUMN.WITH_SUGGESTION] A column or function parameter with name `col_alias`
> cannot be resolved. Did you mean one of the following? [`col_alias`, `min_alias`].{code}
> The LCA and the window expression are in a CTE that is completely unrelated to the HAVING clause.
> LCA resolution should succeed in this case.
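As a side note (not part of the ticket, just an illustrative sketch): until the restriction is eased, a query of this shape can usually be rewritten to avoid the lateral column alias entirely, by repeating the window expression instead of referencing its alias. With no LCA in the SELECT list, the deadlock with UNRESOLVED_HAVING never arises. The `count(*)` in the HAVING below is an assumed correction of the bare `count` in the reproducer:

{code:java}
-- Illustrative workaround (assumption, not from the ticket): repeat the
-- window expression instead of referencing min_alias laterally, so no
-- LCA rewrite is needed and HAVING resolution cannot deadlock on it.
create table t (col boolean) using orc;
with w AS (
  select min(col) over () as min_alias,
         min(col) over () as col_alias  -- repeated, no lateral reference
  FROM t
)
select col_alias
from w
having count(*) > 0;
{code}

The tradeoff is duplicated computation in the plan (two identical window expressions), which the optimizer may or may not deduplicate; the actual fix proposed in the PR is to relax the resolution ordering instead.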
--
This message was sent by Atlassian Jira
(v8.20.10#820010)