techdocsmith commented on code in PR #15902:
URL: https://github.com/apache/druid/pull/15902#discussion_r1503374639
##########
docs/querying/sql-window-functions.md:
##########
@@ -88,40 +88,135 @@ You can use the OVER clause to treat other Druid
aggregation functions as window
Window functions support aliasing.
-## Define a window with the OVER clause
+## Window function syntax
+
+In general, window functions in Druid use the following syntax:
+
+
+```sql
+window_function() OVER (
+  [PARTITION BY partitioning_expression]
+  [ORDER BY order_expression]
+  [{ROWS | RANGE} BETWEEN range_start AND range_end])
+FROM table
+GROUP BY dimensions
+```
+
+```sql
+window_function() OVER w
+FROM table
+GROUP BY dimensions
+WINDOW w AS (
+  [PARTITION BY partitioning_expression]
+  [ORDER BY order_expression]
+  [{ROWS | RANGE} BETWEEN range_start AND range_end])
+```
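+
+As a sketch of the named-window form, reusing the `wikipedia` datasource and the `channel`, `user`, and `delta` columns from the examples below:
+
+```sql
+SELECT
+  channel,
+  user,
+  SUM(delta) net_changes,
+  RANK() OVER w editing_rank
+FROM "wikipedia"
+GROUP BY channel, user
+WINDOW w AS (PARTITION BY channel ORDER BY SUM(delta) DESC)
+```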
The OVER clause defines the query windows for window functions as follows:
- PARTITION BY indicates the dimension that defines the rows within the window
-- ORDER BY specifies the order of the rows within the windows.
+- ORDER BY specifies the order of the rows within the window
+
+An empty OVER clause or the absence of a PARTITION BY clause indicates that
all data belongs to a single window.
+
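+As a minimal sketch of a single-window query (reusing the `wikipedia` datasource and `delta` column from the examples below), an empty OVER clause can place an aggregate over the entire result set alongside each row's own aggregate:
+
+```sql
+SELECT
+  user,
+  SUM(delta) net_user_changes,
+  SUM(SUM(delta)) OVER () total_changes
+FROM "wikipedia"
+GROUP BY user
+```
+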
+For example, the following OVER clause sets the window dimension to `channel` and orders the results by the absolute value of `delta` in ascending order:
+
+```sql
+...
+RANK() OVER (PARTITION BY channel ORDER BY ABS(delta) ASC)
+...
+```
+
+Window frames, set in ROWS and RANGE expressions, limit the set of rows used for the windowed aggregation.
+
+ROWS and RANGE accept the following values for the frame start and end:
+- UNBOUNDED PRECEDING - from the beginning of the partition, as ordered by the order expression
+- N ROWS PRECEDING - N rows before the current row as ordered by the order
expression
+- CURRENT ROW - the current row
+- N ROWS FOLLOWING - N rows after the current row as ordered by the order
expression
+- UNBOUNDED FOLLOWING - to the end of the partition as ordered by the order
expression
+
+See [Example with window frames](#example-with-window-frames) for more detail.
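+
+For instance, the following window frame computes a moving sum over the current row and the two preceding rows within each channel (a sketch using the columns from the examples below):
+
+```sql
+SELECT
+  __time,
+  channel,
+  delta,
+  SUM(delta) OVER (
+    PARTITION BY channel
+    ORDER BY __time
+    ROWS BETWEEN 2 PRECEDING AND CURRENT ROW) moving_delta
+FROM "wikipedia"
+```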
+
+Druid applies the GROUP BY dimensions and calculates all non-window aggregation functions first. Then it applies the window function over the aggregated results.
:::note
Sometimes windows are called partitions. However, the partitioning for window functions is a shuffle (partition) of the result set created at query time, and is not to be confused with Druid's segment partitioning feature, which partitions data at ingest time.
:::
-The following OVER clause example sets the window dimension to `channel` and
orders the results by the absolute value of `delta` ascending:
+### ORDER BY windows
+
+When the window definition only specifies ORDER BY, it sorts the aggregate data set and applies the function in that order.
+
+The following query uses ORDER BY SUM(delta) DESC to rank user hourly activity from the most changed to the least changed within an hour:
```sql
-...
-RANK() OVER (PARTITION BY channel ORDER BY ABS(delta) ASC)
-...
+SELECT
+  TIME_FLOOR(__time, 'PT1H') AS time_hour,
+ channel,
+ user,
+ SUM(delta) net_user_changes,
+  RANK() OVER (ORDER BY SUM(delta) DESC) editing_rank
+FROM "wikipedia"
+WHERE channel IN ('#kk.wikipedia', '#lt.wikipedia')
+ AND __time BETWEEN '2016-06-27' AND '2016-06-28'
+GROUP BY TIME_FLOOR(__time, 'PT1H'), channel, user
+ORDER BY 5
```
+### PARTITION BY windows
+
+When a window only specifies PARTITION BY partition expression, Druid
calculates the aggregate window function over all the rows that share a values
within the selected dataset.
Review Comment:
```suggestion
When a window only specifies PARTITION BY partition expression, Druid
calculates the aggregate window function over all the rows that share a value
of the partition expression within the selected dataset.
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]