maropu commented on a change in pull request #28220: [SPARK-31390][SQL][DOCS] Document Window Function in SQL Syntax Section
URL: https://github.com/apache/spark/pull/28220#discussion_r408723154
 
 

 ##########
 File path: docs/sql-ref-syntax-qry-window.md
 ##########
 @@ -19,4 +19,222 @@ license: |
   limitations under the License.
 ---
 
-**This page is under construction**
+### Description
+
+Similar to aggregate functions, window functions operate on a group of rows. Unlike aggregate functions, however, window functions do not reduce the group to a single row; they calculate a return value for each row in the group. Window functions are useful for processing tasks such as calculating a moving average, computing a cumulative statistic, or accessing the value of rows given the relative position of the current row. Spark SQL supports three types of window functions:
+  * Ranking Functions
+  * Analytic Functions
+  * Aggregate Functions
+
+### How to Use Window Functions
+
+  * Mark a function as a window function by using `over`.
+    - SQL: Add an OVER clause after the window function, e.g. `rank() OVER ( ... )` (see the example following this list).
+    - DataFrame API: Call the window function's `over` method, e.g. `rank().over( ... )`.
+  * Define the window specification associated with this function. A window specification includes a partitioning specification, an ordering specification, and a frame specification.
+    - Partitioning Specification:
+      - SQL: PARTITION BY
+      - DataFrame API: `Window.partitionBy( ... )`
+    - Ordering Specification:
+      - SQL: ORDER BY
+      - DataFrame API: `Window.orderBy( ... )`
+    - Frame Specification:
+      - SQL: ROWS (for row frame), RANGE (for range frame)
+      - DataFrame API: `WindowSpec.rowsBetween( ... )` (for row frame), `WindowSpec.rangeBetween( ... )` (for range frame)
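+
+As an illustration, the query below ranks rows within each partition; the `employee` table and its `name`, `dept`, and `salary` columns are hypothetical and used only for this sketch:
+
+{% highlight sql %}
+-- Rank employees within each department by descending salary.
+SELECT name, dept, salary,
+       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rank_in_dept
+FROM employee;
+{% endhighlight %}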
+
+### Syntax
+
+{% highlight sql %}
+window_function OVER ( [ partition_spec ] order_spec [ window_frame ] )
+{% endhighlight %}
+
+### Parameters
+
+<dl>
+  <dt><code><em>window_function</em></code></dt>
+  <dd>
+    <ul>
+      <li> Ranking Functions </li>
+      <br>
+      <b>Syntax:</b>
+        <code>
+          RANK | DENSE_RANK | PERCENT_RANK | NTILE | ROW_NUMBER
+        </code>
+    </ul>
+    <ul>
+      <li> Analytic Functions </li>
+      <br>
+      <b>Syntax:</b>
+        <code>
+          CUME_DIST | LAG | LEAD
+        </code>
+    </ul>
+    <ul>
+      <li> Aggregate Functions </li>
+      <br>
+      <b>Syntax:</b>
+        <code>
+          MAX | MIN | COUNT | SUM | AVG | ...
+        </code>
+        <br>
+        Please refer to <a href="api/sql/">this page</a> for a complete list of Spark aggregate functions.
+    </ul>
+  </dd>
+</dl>
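+
+For example, the analytic functions LAG and LEAD access values from neighboring rows within the window; the `sales` table and its `year` and `amount` columns below are hypothetical:
+
+{% highlight sql %}
+-- Compare each year's amount with the previous and next year's amount.
+SELECT year, amount,
+       LAG(amount, 1) OVER (ORDER BY year) AS prev_amount,
+       LEAD(amount, 1) OVER (ORDER BY year) AS next_amount
+FROM sales;
+{% endhighlight %}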
+<dl>
+  <dt><code><em>partition_spec</em></code></dt>
+  <dd>
+    Specifies a comma separated list of expressions used to partition the rows.<br><br>
+    <b>Syntax:</b><br>
+      <code>
+        { PARTITION | DISTRIBUTE } BY expression [ , ... ]
+      </code>
+  </dd>
+</dl>
+<dl>
+  <dt><code><em>order_spec</em></code></dt>
+  <dd>
+    Specifies an ordering of the rows.<br><br>
+    <b>Syntax:</b><br>
+      <code>
+        { ORDER | SORT } BY { expression [ ASC | DESC ] [ NULLS { FIRST | LAST } ] [ , ... ] }
+      </code>
+  </dd>
+</dl>
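+
+As an example of the ordering specification, the query below orders rows within the window by descending salary and places NULL values last; the table and columns are hypothetical:
+
+{% highlight sql %}
+-- Dense-rank employees by salary, with NULL salaries ranked last.
+SELECT name, salary,
+       DENSE_RANK() OVER (ORDER BY salary DESC NULLS LAST) AS salary_rank
+FROM employee;
+{% endhighlight %}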
+<dl>
+  <dt><code><em>window_frame</em></code></dt>
+  <dd>
+    Specifies which row to start the window on and where to end it.<br><br>
+    <b>Syntax:</b><br>
+      <code>
+        { RANGE | ROWS } { frame_start | BETWEEN frame_start AND frame_end }
+      </code> <br><br>
+      <code>frame_start</code>, <code>frame_end</code><br>
+      Specifies a frame boundary, which is one of:<br>
+      <code>
+        UNBOUNDED PRECEDING | offset PRECEDING | CURRENT ROW | offset FOLLOWING | UNBOUNDED FOLLOWING
+      </code><br><br>
+      <code>offset</code><br>
+      Specifies the offset from the current row: a positive integer for a ROW frame, or an expression of a type compatible with the ordering expression (e.g. a number or an interval) for a RANGE frame.
+  </dd>
+</dl>
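+
+For example, the query below computes a running total per department with an explicit row frame; the `employee` table and its columns are hypothetical:
+
+{% highlight sql %}
+-- Running total of salaries within each department, ordered by salary.
+SELECT name, dept, salary,
+       SUM(salary) OVER (PARTITION BY dept ORDER BY salary
+                         ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW) AS running_total
+FROM employee;
+{% endhighlight %}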
 
 Review comment:
   nit: add a single blank.
