[ https://issues.apache.org/jira/browse/SPARK-7595?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-7595:
-----------------------------
    Assignee: Weizhong

> Window will cause resolve failed with self join
> -----------------------------------------------
>
>                 Key: SPARK-7595
>                 URL: https://issues.apache.org/jira/browse/SPARK-7595
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Weizhong
>            Assignee: Weizhong
>            Priority: Minor
>             Fix For: 1.4.0
>
>
> For example:
> table: src(key string, value string)
> sql:
>   with v1 as (
>     select key, count(value) over (partition by key) cnt_val from src
>   ),
>   v2 as (
>     select v1.key, v1_lag.cnt_val
>     from v1, v1 v1_lag
>     where v1.key = v1_lag.key
>   )
>   select * from v2 limit 5;
>
> Analysis then fails when resolving conflicting references in Join:
>
> 'Limit 5
>  'Project [*]
>   'Subquery v2
>    'Project ['v1.key,'v1_lag.cnt_val]
>     'Filter ('v1.key = 'v1_lag.key)
>      'Join Inner, None
>       Subquery v1
>        Project [key#95,cnt_val#94L]
>         Window [key#95,value#96], [HiveWindowFunction#org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount(value#96) WindowSpecDefinition [key#95], [], ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING AS cnt_val#94L], WindowSpecDefinition [key#95], [], ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
>          Project [key#95,value#96]
>           MetastoreRelation default, src, None
>       Subquery v1_lag
>        Subquery v1
>         Project [key#97,cnt_val#94L]
>          Window [key#97,value#98], [HiveWindowFunction#org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCount(value#98) WindowSpecDefinition [key#97], [], ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING AS cnt_val#94L], WindowSpecDefinition [key#97], [], ROWS BETWEEN UNBOUNDED PRECEDING AND UNBOUNDED FOLLOWING
>           Project [key#97,value#98]
>            MetastoreRelation default, src, None
>
> Conflicting attributes: cnt_val#94L

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
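As a side note, the query itself is standard SQL; the failure is specific to Spark's analyzer (the window output `cnt_val#94L` keeps the same expression ID on both sides of the self join). A minimal sketch of the expected behavior, assuming a toy `src` table and using Python's built-in `sqlite3` (which supports window functions with SQLite >= 3.25) rather than Spark:

```python
# Run the reported query against a hypothetical toy src table in SQLite
# to show what a correct analyzer should produce. The table contents here
# are invented for illustration; only the query comes from the report.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (key TEXT, value TEXT)")
conn.executemany(
    "INSERT INTO src VALUES (?, ?)",
    [("a", "1"), ("a", "2"), ("b", "3")],
)

rows = conn.execute("""
    WITH v1 AS (
        SELECT key, COUNT(value) OVER (PARTITION BY key) AS cnt_val
        FROM src
    ),
    v2 AS (
        SELECT v1.key, v1_lag.cnt_val
        FROM v1, v1 AS v1_lag
        WHERE v1.key = v1_lag.key
    )
    SELECT * FROM v2 LIMIT 5
""").fetchall()

# Every 'a' row should carry cnt_val = 2 and every 'b' row cnt_val = 1;
# the self join succeeds because each reference to v1 gets its own scope.
print(rows)
```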