[ https://issues.apache.org/jira/browse/SPARK-10777?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14954463#comment-14954463 ]
kevin yu commented on SPARK-10777:
----------------------------------

Hello Campbell: I tried the same query on Hive version 1.2.1 and got the same failure. It looks like neither Hive nor Spark SQL supports this yet.

@a : Hello, I am new to Spark, and I wish to contribute to the Spark community. I can recreate the problem on both Hive and Spark SQL; I think neither supports this yet. Should I look into adding this support in Spark SQL, or is there already another plan for it? Thanks for your advice.

Kevin

> order by fails when column is aliased and projection includes windowed aggregate
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-10777
>                 URL: https://issues.apache.org/jira/browse/SPARK-10777
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: N Campbell
>
> This statement fails in SPARK (works fine in ORACLE, DB2 ....)
>
> select r as c1, min ( s ) over () as c2 from
> ( select rnum r, sum ( cint ) s from certstring.tint group by rnum ) t
> order by r
>
> Error: org.apache.spark.sql.AnalysisException: cannot resolve 'r' given input columns c1, c2; line 3 pos 9
> SQLState: null
> ErrorCode: 0
>
> Forcing the aliased column name works around the defect
>
> select r as c1, min ( s ) over () as c2 from
> ( select rnum r, sum ( cint ) s from certstring.tint group by rnum ) t
> order by c1
>
> These work fine
>
> select r as c1, min ( s ) over () as c2 from
> ( select rnum r, sum ( cint ) s from certstring.tint group by rnum ) t
> order by c1
>
> select r as c1, s as c2 from
> ( select rnum r, sum ( cint ) s from certstring.tint group by rnum ) t
> order by r
>
> create table if not exists TINT ( RNUM int , CINT int )
> ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n'
> STORED AS ORC ;

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
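For reference, the behavior the report expects, resolving "order by r" against the subquery's column even though the projection aliases it to c1 and adds a windowed aggregate, is what other engines do. A minimal sketch of that resolution, using Python's built-in sqlite3 as a stand-in engine (assuming a SQLite build with window-function support, 3.25+; the table data here is made up for illustration, not from the report):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tint (rnum INTEGER, cint INTEGER);
    INSERT INTO tint VALUES (1, 10), (1, 20), (2, 5);
""")

# The failing shape from the report: project r aliased to c1 plus a
# windowed aggregate, then ORDER BY the *original* column name r.
rows = conn.execute("""
    SELECT r AS c1, MIN(s) OVER () AS c2
    FROM (SELECT rnum AS r, SUM(cint) AS s FROM tint GROUP BY rnum) t
    ORDER BY r
""").fetchall()

# SQLite resolves r through the alias; Spark 1.5 raises
# AnalysisException ("cannot resolve 'r' given input columns c1, c2").
print(rows)  # [(1, 5), (2, 5)]
```

The same statement, run through Spark SQL 1.5.0's sqlContext.sql(), is what produces the AnalysisException above; rewriting the final line as ORDER BY c1 is the workaround the report describes.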