rxin commented on code in PR #39134:
URL: https://github.com/apache/spark/pull/39134#discussion_r1053746714


##########
sql/core/src/test/resources/sql-tests/inputs/group-by-all.sql:
##########
@@ -0,0 +1,72 @@
+-- group by all
+-- see https://www.linkedin.com/posts/mosha_duckdb-firebolt-snowflake-activity-7009615821006131200-VQ0o
+
+create temporary view data as select * from values
+  ("USA", "San Francisco", "Reynold", 1, 11.0),
+  ("USA", "San Francisco", "Matei", 2, 12.0),
+  ("USA", "Berkeley", "Xiao", 3, 13.0),
+  ("China", "Hangzhou", "Wenchen", 4, 14.0),
+  ("China", "Shanghai", "Shanghaiese", 5, 15.0),
+  ("Korea", "Seoul", "Hyukjin", 6, 16.0),
+  ("UK", "London", "Sean", 7, 17.0)
+  as data(country, city, name, id, power);
+
+-- basic
+select country, count(*) from data group by ALL;
+
+-- different case
+select country, count(*) from data group by aLl;
+
+-- a column named "all" would still work
+select all, city, count(*) from (select country as all, city, id from data) group by all, city;
+
+-- a column named "all" should take precedence over the normal group by all expansion
+-- if the "group by all" is expanded to refer to the substr, then we'd have only 3 rows in output,
+-- because "USA" and "UK" both start with "U". if the "group by all" is referring to the all
+-- column, the output would have 4 rows.
+select substr(all, 0, 1), count(*) from (select country as all, id from data) group by all;

Review Comment:
   @cloud-fan I think this case happens to work today because ResolveGroupByAll runs after ResolveReferences. But since they are in the same batch, and ResolveReferences is not run to a fixed point on its own before ResolveGroupByAll fires, is it possible that this won't work for more complex cases?
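   As a purely hypothetical sketch of the kind of query I have in mind (the extra level of nesting is made up for illustration and is not one of this PR's test cases; it just reuses the data view defined above), an "all" alias that only becomes resolvable after an inner projection has been resolved might be expanded too early:
   
   ```sql
   -- hypothetical: the alias "all" is introduced one subquery level down, so the
   -- outer "group by all" can only bind to it as a column after ResolveReferences
   -- has resolved the inner projection; analogous to the substr test above, this
   -- should produce 4 rows if "all" binds to the column and 3 if it gets expanded
   select substr(all, 0, 1), count(*)
   from (select country as all from (select country, id from data))
   group by all;
   ```
   
   If the rule ordering within the batch does guarantee this still binds to the column, it might be worth adding a test along these lines to pin the behavior down.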
   
   


