[ https://issues.apache.org/jira/browse/SPARK-37500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17488649#comment-17488649 ]
Lauri Koobas commented on SPARK-37500:
--------------------------------------

This particular case is indeed fixed, but I believe it was fixed in a bad way.

This works (notice the JOIN is commented out):
{noformat}
create or replace temporary view test_temp_view as
with step_1 as (
    select *
        , min(a) over w2 as min_a_over_w1
    from (select 1 as a, 2 as b, 3 as c)
    window w2 as (partition by b order by c)
)
, step_2 as (
    select *
    from (select 1 as e, 2 as f, 3 as g)
    --join step_1 on true
    window w1 as (partition by f order by g)
)
select * from step_2
{noformat}

This doesn't:
{noformat}
create or replace temporary view test_temp_view as
with step_1 as (
    select *
        , min(a) over w2 as min_a_over_w1
    from (select 1 as a, 2 as b, 3 as c)
    window w2 as (partition by b order by c)
)
, step_2 as (
    select *
    from (select 1 as e, 2 as f, 3 as g)
    join step_1 on true
    window w1 as (partition by f order by g)
)
select * from step_2
{noformat}

It seems that JOIN-ing one CTE that has a named window to another CTE with a named window overwrites or clears some scope. Somehow it also applies ONLY when creating a view, not when simply running the query.

> Incorrect scope when using named_windows in CTEs
> ------------------------------------------------
>
>                 Key: SPARK-37500
>                 URL: https://issues.apache.org/jira/browse/SPARK-37500
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.2, 3.2.0
>         Environment: Databricks Runtime 9.0, 9.1, 10.0
>            Reporter: Lauri Koobas
>            Priority: Major
>
> This works, but shouldn't. The named_window is described outside the CTE that
> uses it.
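For comparison, in standard SQL a window name declared in a WINDOW clause is visible only within the SELECT that declares it, so a CTE should not be able to reference a window named in the outer query. A minimal sketch of that scoping rule using Python's sqlite3 (SQLite follows the standard here; this illustrates the expected scoping, not Spark's behavior, and assumes an SQLite build with window-function support, 3.25+):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Window defined inside the CTE that uses it: valid, returns a row.
ok = conn.execute("""
    with step_1 as (
        select *, min(a) over w1 as min_a_over_w1
        from (select 1 as a, 2 as b, 3 as c)
        window w1 as (partition by b order by c)
    )
    select * from step_1
""").fetchall()
print(ok)  # [(1, 2, 3, 1)]

# Window defined only in the outer query: the CTE cannot see w1,
# so SQLite rejects the statement (the report says Spark accepted it).
try:
    conn.execute("""
        with step_1 as (
            select *, min(a) over w1 as min_a_over_w1
            from (select 1 as a, 2 as b, 3 as c)
        )
        select * from step_1
        window w1 as (partition by b order by c)
    """)
    leaked = True
except sqlite3.OperationalError as e:
    leaked = False
    print("rejected:", e)
```

Under standard scoping the second statement fails to resolve `w1`, which is the behavior the issue title says Spark gets wrong.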
> {code:sql}
> with step_1 as (
>     select *
>         , min(a) over w1 as min_a_over_w1
>     from (select 1 as a, 2 as b, 3 as c)
> )
> select *
> from step_1
> window w1 as (partition by b order by c)
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org