Repository: spark
Updated Branches:
  refs/heads/branch-1.5 561390dbc -> e56bcc638


[DOCS] [SQL] [PYSPARK] Fix typo in ntile function

Fix typo in ntile function.

Author: Moussa Taifi <[email protected]>

Closes #8261 from moutai/patch-2.

(cherry picked from commit 865a3df3d578c0442c97d749c81f554b560da406)
Signed-off-by: Sean Owen <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e56bcc63
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/e56bcc63
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/e56bcc63

Branch: refs/heads/branch-1.5
Commit: e56bcc6381e6bf3e8940a83d45cdb1ff6e660a66
Parents: 561390d
Author: Moussa Taifi <[email protected]>
Authored: Wed Aug 19 09:42:41 2015 +0100
Committer: Sean Owen <[email protected]>
Committed: Wed Aug 19 09:42:50 2015 +0100

----------------------------------------------------------------------
 python/pyspark/sql/functions.py | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/e56bcc63/python/pyspark/sql/functions.py
----------------------------------------------------------------------
diff --git a/python/pyspark/sql/functions.py b/python/pyspark/sql/functions.py
index 41dfee9..4b74a50 100644
--- a/python/pyspark/sql/functions.py
+++ b/python/pyspark/sql/functions.py
@@ -531,7 +531,7 @@ def lead(col, count=1, default=None):
 def ntile(n):
     """
     Window function: returns the ntile group id (from 1 to `n` inclusive)
-    in an ordered window partition. Fow example, if `n` is 4, the first
+    in an ordered window partition. For example, if `n` is 4, the first
     quarter of the rows will get value 1, the second quarter will get 2,
     the third quarter will get 3, and the last quarter will get 4.
 


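For context, the docstring corrected above describes how `ntile` buckets an ordered window partition into `n` groups. Below is a pure-Python sketch of the standard SQL NTILE distribution rule (when the row count does not divide evenly, earlier tiles receive one extra row). The helper name `ntile_ids` is hypothetical and not part of PySpark; this is an illustration of the rule, not Spark's implementation.

```python
def ntile_ids(num_rows, n):
    """Return the ntile group id (1..n) for each row of an ordered
    partition of num_rows rows, following the standard SQL NTILE rule:
    each tile gets num_rows // n rows, and the first num_rows % n
    tiles get one extra row."""
    base, extra = divmod(num_rows, n)
    ids = []
    for tile in range(1, n + 1):
        size = base + (1 if tile <= extra else 0)
        ids.extend([tile] * size)
    return ids

# 8 rows into 4 tiles: each quarter gets its own id, as in the docstring.
print(ntile_ids(8, 4))   # [1, 1, 2, 2, 3, 3, 4, 4]
# 10 rows into 4 tiles: tiles 1 and 2 absorb the two remainder rows.
print(ntile_ids(10, 4))  # [1, 1, 1, 2, 2, 2, 3, 3, 4, 4]
```

In PySpark itself the function is applied over a window spec, e.g. `ntile(4).over(Window.partitionBy("k").orderBy("v"))`.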