[ https://issues.apache.org/jira/browse/SPARK-20963?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16111574#comment-16111574 ]
Hyukjin Kwon commented on SPARK-20963:
--------------------------------------

User 'maropu' has created a pull request for this issue:
https://github.com/apache/spark/pull/18772

> Support column aliases for aliased relation in FROM clause
> ----------------------------------------------------------
>
>                 Key: SPARK-20963
>                 URL: https://issues.apache.org/jira/browse/SPARK-20963
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.1.1
>            Reporter: Takeshi Yamamuro
>
> Currently, we do not support column aliases for an aliased relation:
> {code}
> scala> Seq((1, 2), (2, 0)).toDF("id", "value").createOrReplaceTempView("t1")
> scala> Seq((1, 2), (2, 0)).toDF("id", "value").createOrReplaceTempView("t2")
> scala> sql("SELECT * FROM (t1 JOIN t2)")
> scala> sql("SELECT * FROM (t1 INNER JOIN t2 ON t1.id = t2.id) AS t(a, b, c, d)").show
> org.apache.spark.sql.catalyst.parser.ParseException:
> mismatched input '(' expecting {<EOF>, ',', 'WHERE', 'GROUP', 'ORDER',
> 'HAVING', 'LIMIT', 'JOIN', 'CROSS', 'INNER', 'LEFT', 'RIGHT', 'FULL',
> 'NATURAL', 'LATERAL', 'WINDOW', 'UNION', 'EXCEPT', 'MINUS', 'INTERSECT',
> 'SORT', 'CLUSTER', 'DISTRIBUTE', 'ANTI'}(line 1, pos 54)
>
> == SQL ==
> SELECT * FROM (t1 INNER JOIN t2 ON t1.id = t2.id) AS t(a, b, c, d)
> ------------------------------------------------------^^^
>
>   at org.apache.spark.sql.catalyst.parser.ParseException.withCommand(ParseDriver.scala:217)
>   at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:114)
>   at org.apache.spark.sql.execution.SparkSqlParser.parse(Spa
> {code}
> We could support this by referring to:
> http://docs.aws.amazon.com/redshift/latest/dg/r_FROM_clause30.html

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
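
[Editorial note, not part of the original message: until syntax like `AS t(a, b, c, d)` on an aliased join relation is supported, one workaround sketch that Spark's parser already accepts is to alias each column explicitly in the SELECT list. The names `a`..`d` below simply mirror the aliases attempted in the failing query above.]

```sql
-- Workaround sketch: alias columns in the SELECT list instead of
-- attaching AS t(a, b, c, d) to the aliased join relation.
SELECT t1.id    AS a,
       t1.value AS b,
       t2.id    AS c,
       t2.value AS d
FROM t1 INNER JOIN t2 ON t1.id = t2.id
```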