[ https://issues.apache.org/jira/browse/SPARK-19778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15889700#comment-15889700 ]
Takeshi Yamamuro commented on SPARK-19778:
------------------------------------------

{code}
scala> Seq(("a", 0), ("b", 1)).toDF("key", "value").createOrReplaceTempView("t")
scala> sql("SELECT key AS key1 FROM t GROUP BY key1")
org.apache.spark.sql.AnalysisException: cannot resolve '`key1`' given input columns: [key, value]; line 1 pos 35;
'Aggregate ['key1], [key#15 AS key1#21]
+- SubqueryAlias t
   +- Project [_1#12 AS key#15, _2#13 AS value#16]
      +- LocalRelation [_1#12, _2#13]

  at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
  at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:75)
  at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:72)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
  at org.apache.spark.sql
{code}

Aha, this makes sense to me because, for example, PostgreSQL accepts this query:

{code}
postgres=# create table t(key INT, value INT);
CREATE TABLE
postgres=# insert into t values(1, 0);
INSERT 0 1
postgres=# SELECT key AS key1 FROM t GROUP BY key1;
 key1
------
    1
(1 row)
{code}

> alias cannot be used in GROUP BY
> --------------------------------
>
>                 Key: SPARK-19778
>                 URL: https://issues.apache.org/jira/browse/SPARK-19778
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: xukun
>
> Does not support "select key as key1 from src group by key1".
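For comparison, SQLite also resolves SELECT-list aliases in GROUP BY the way PostgreSQL does in the session above. A minimal, self-contained sketch of the same query using Python's stdlib sqlite3 (the table name `t` and columns `key`/`value` mirror the example; the extra rows are made up for illustration):

```python
import sqlite3

# In-memory database mirroring the PostgreSQL example from the comment.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t(key INT, value INT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, 0), (1, 5), (2, 3)])

# SQLite, like PostgreSQL, lets GROUP BY refer to a SELECT-list alias;
# Spark 2.1's analyzer instead fails to resolve 'key1' here.
rows = sorted(conn.execute("SELECT key AS key1 FROM t GROUP BY key1").fetchall())
print(rows)  # [(1,), (2,)]
conn.close()
```

In Spark 2.1 itself the usual workarounds are to repeat the underlying column (`GROUP BY key`) or to introduce the alias in a subquery before grouping.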