This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 68496c1 [SPARK-26451][SQL] Change lead/lag argument name from count to offset
68496c1 is described below
commit 68496c1af310aadfb1b226cb05be510252769d43
Author: deepyaman <[email protected]>
AuthorDate: Fri Dec 28 00:02:41 2018 +0800
[SPARK-26451][SQL] Change lead/lag argument name from count to offset
## What changes were proposed in this pull request?
This change aligns the Python argument name with the one used in the Scala version and in the documentation.
Closes #23357 from deepyaman/patch-1.
Authored-by: deepyaman <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
---
python/pyspark/sql/functions.py | 12 ++++++------
1 file changed, 6 insertions(+), 6 deletions(-)
diff --git a/python/pyspark/sql/functions.py b/python/pyspark/sql/functions.py
index d2a771e..3c33e2b 100644
--- a/python/pyspark/sql/functions.py
+++ b/python/pyspark/sql/functions.py
@@ -798,7 +798,7 @@ def factorial(col):
# --------------- Window functions ------------------------
@since(1.4)
-def lag(col, count=1, default=None):
+def lag(col, offset=1, default=None):
"""
Window function: returns the value that is `offset` rows before the current row, and
`defaultValue` if there is less than `offset` rows before the current row. For example,
@@ -807,15 +807,15 @@ def lag(col, count=1, default=None):
This is equivalent to the LAG function in SQL.
:param col: name of column or expression
- :param count: number of row to extend
+ :param offset: number of rows to extend
:param default: default value
"""
sc = SparkContext._active_spark_context
- return Column(sc._jvm.functions.lag(_to_java_column(col), count, default))
+ return Column(sc._jvm.functions.lag(_to_java_column(col), offset, default))
@since(1.4)
-def lead(col, count=1, default=None):
+def lead(col, offset=1, default=None):
"""
Window function: returns the value that is `offset` rows after the current row, and
`defaultValue` if there is less than `offset` rows after the current row. For example,
@@ -824,11 +824,11 @@ def lead(col, count=1, default=None):
This is equivalent to the LEAD function in SQL.
:param col: name of column or expression
- :param count: number of row to extend
+ :param offset: number of rows to extend
:param default: default value
"""
sc = SparkContext._active_spark_context
- return Column(sc._jvm.functions.lead(_to_java_column(col), count, default))
+ return Column(sc._jvm.functions.lead(_to_java_column(col), offset, default))
@since(1.4)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]