[ https://issues.apache.org/jira/browse/SPARK-32101?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-32101.
----------------------------------
    Resolution: Incomplete

> The name in the WITH clause is the same as a table name, and when that
> table name is used elsewhere, Spark does not take the table; it takes
> the WITH clause.
> ----------------------------------------------------------------------
>
>                 Key: SPARK-32101
>                 URL: https://issues.apache.org/jira/browse/SPARK-32101
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.2.1
>            Reporter: Pavan Kothamasu
>            Priority: Minor
>              Labels: spark, spark-sql, sql
>   Original Estimate: 1,680h
>  Remaining Estimate: 1,680h
>
> When the name in a WITH clause is the same as a table name, and that
> table name is then used elsewhere in the query, Spark does not resolve
> it to the table; it resolves it to the WITH-clause subquery. An example
> with an explanation is given below.
>
> Structure of database1.sample:
> columns: id, name
>
> with sample as (
>     select id, 1 as cnt from database1.sample),
> amp as (
>     select name, id from database1.sample)
> select * from sample inner join amp on amp.id = sample.id;
>
> In this example, the second alias fails with a missing "name" column
> error, even though the database1.sample table has a "name" column,
> because "name" is resolved against the first alias, sample. The error
> occurs because the table name and the alias name are the same.
> The bug is that the alias is taken even when the database name is given
> along with the table name; columns should be resolved from the table's
> metadata, not from the alias.
> The workaround is to not reuse the table name as an alias name: the
> first alias should be renamed to something like sample1.
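For reference, a minimal spark-sql sketch of the reproduction above. The
database1 and sample objects are stand-ins taken from the report and are
created here only for illustration; whether the second CTE fails as
described may depend on the Spark version (the report is against 2.2.1).

    -- Stand-in objects from the report, created only for this sketch.
    create database if not exists database1;
    create table if not exists database1.sample (id int, name string);

    -- The reported query: the CTE named "sample" shadows the table
    -- name, so the final SELECT resolves "sample" to the CTE.
    with sample as (
        select id, 1 as cnt from database1.sample),
    amp as (
        select name, id from database1.sample)
    select * from sample inner join amp on amp.id = sample.id;

    -- The workaround from the report: rename the first CTE so it no
    -- longer collides with the table name.
    with sample1 as (
        select id, 1 as cnt from database1.sample),
    amp as (
        select name, id from database1.sample)
    select * from sample1 inner join amp on amp.id = sample1.id;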