Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17865
---
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r125536602
--- Diff: R/pkg/R/functions.R ---
@@ -1900,8 +1901,8 @@ setMethod("year",
#' @details
#' \code{atan2}: Returns the angle theta from the conversion of rectangular coordinates (x, y) to polar coordinates (r, theta).
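The docstring under review describes the two-argument arctangent. pyspark's `atan2` column function shares the semantics of Python's stdlib `math.atan2`, so a plain-Python sketch (no Spark session assumed) can illustrate the angle it returns:

```python
import math

# atan2(y, x) is the angle theta of the point (x, y) when converting
# rectangular coordinates to polar coordinates -- unlike atan(y / x),
# it honors the quadrant of the point.
print(math.atan2(0.0, 1.0))                 # positive x-axis: 0.0
print(math.atan2(1.0, 0.0) == math.pi / 2)  # straight up: True
print(math.atan2(0.0, -1.0) == math.pi)     # negative x-axis: True
```

In a DataFrame these would be the per-row values of `F.atan2(df.y, df.x)`.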
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r125518551
--- Diff: R/pkg/R/functions.R ---
@@ -148,7 +148,8 @@ setMethod("asin",
#' atan
#'
-#' Computes the tangent inverse of the given value
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r125201828
--- Diff: python/pyspark/sql/functions.py ---
@@ -1073,12 +1108,17 @@ def last_day(date):
return Column(sc._jvm.functions.last_day(_to_java_column(date)))
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123964695
--- Diff: python/pyspark/sql/functions.py ---
@@ -1116,12 +1160,12 @@ def from_utc_timestamp(timestamp, tz):
@since(1.5)
def to_utc_timestamp(timestamp, tz):
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123962727
--- Diff: python/pyspark/sql/functions.py ---
@@ -200,17 +225,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123965739
--- Diff: python/pyspark/sql/functions.py ---
@@ -473,10 +503,15 @@ def rand(seed=None):
return Column(jc)
+@ignore_unicode_prefix
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123967003
--- Diff: python/pyspark/sql/functions.py ---
@@ -95,10 +100,13 @@ def _():
'0.0 through pi.',
'asin': 'Computes the sine inverse of the given value; the returned angle is in the range -pi/2 through pi/2.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123963627
--- Diff: python/pyspark/sql/functions.py ---
@@ -282,8 +309,7 @@ def corr(col1, col2):
@since(2.0)
def covar_pop(col1, col2):
-"
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123964972
--- Diff: python/pyspark/sql/functions.py ---
@@ -1258,7 +1302,7 @@ def hash(*cols):
'uppercase. Words are delimited by whitespace.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123962284
--- Diff: R/pkg/R/functions.R ---
@@ -426,7 +426,7 @@ setMethod("covar_pop", signature(col1 =
"characterOrColumn", col2 = "characterOr
#' cos
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123960210
--- Diff: R/pkg/R/functions.R ---
@@ -426,7 +426,7 @@ setMethod("covar_pop", signature(col1 =
"characterOrColumn", col2 = "characterOr
#' cos
Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123941351
--- Diff: R/pkg/R/functions.R ---
@@ -426,7 +426,7 @@ setMethod("covar_pop", signature(col1 =
"characterOrColumn", col2 = "characterOr
#' cos
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123912061
--- Diff: python/pyspark/sql/functions.py ---
@@ -969,8 +1005,8 @@ def months_between(date1, date2):
"""
Returns the number of months between date1 and date2.
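Spark's `months_between` combines a whole-month difference with a fractional part based on a 31-day month. A simplified pure-Python approximation (a hypothetical helper, not the JVM implementation, which also special-cases month-end dates) conveys the idea:

```python
from datetime import date

def months_between(date1, date2):
    """Approximate Spark's months_between: whole calendar months
    plus a fractional part computed over a 31-day month."""
    months = (date1.year - date2.year) * 12 + (date1.month - date2.month)
    return months + (date1.day - date2.day) / 31.0

# Same day of month, three calendar months apart -> exactly 3.0:
print(months_between(date(2017, 6, 15), date(2017, 3, 15)))  # 3.0
# Partial month: 15 days expressed as a fraction of 31:
print(round(months_between(date(2017, 6, 30), date(2017, 6, 15)), 2))
```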
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123245717
--- Diff: python/pyspark/sql/functions.py ---
@@ -200,17 +226,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123184803
--- Diff: python/pyspark/sql/functions.py ---
@@ -969,8 +1005,8 @@ def months_between(date1, date2):
"""
Returns the number of months between date1 and date2.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123182922
--- Diff: python/pyspark/sql/functions.py ---
@@ -67,9 +67,15 @@ def _():
_.__doc__ = 'Window function: ' + doc
return _
+_li
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123182491
--- Diff: python/pyspark/sql/functions.py ---
@@ -109,15 +118,33 @@ def _():
'rint': 'Returns the double value that is closest in value to the argument and is equal to a mathematical integer.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123184373
--- Diff: python/pyspark/sql/functions.py ---
@@ -267,8 +296,7 @@ def coalesce(*cols):
@since(1.6)
def corr(col1, col2):
-"""Retu
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123186254
--- Diff: python/pyspark/sql/functions.py ---
@@ -312,7 +338,7 @@ def covar_samp(col1, col2):
@since(1.3)
def countDistinct(col, *cols):
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123179950
--- Diff: python/pyspark/sql/functions.py ---
@@ -147,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions = {
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123183863
--- Diff: python/pyspark/sql/functions.py ---
@@ -200,17 +226,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123184653
--- Diff: python/pyspark/sql/functions.py ---
@@ -793,9 +824,9 @@ def date_format(date, format):
.. note:: Use whenever possible specialized functions like `year`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123185008
--- Diff: python/pyspark/sql/functions.py ---
@@ -1105,9 +1150,9 @@ def from_utc_timestamp(timestamp, tz):
Given a timestamp, which corresponds to a certain time of day in UTC, returns another timestamp that corresponds to the same time of day in the given timezone.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123184878
--- Diff: python/pyspark/sql/functions.py ---
@@ -1073,12 +1109,17 @@ def last_day(date):
return Column(sc._jvm.functions.last_day(_to_java_column(date)))
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r123180524
--- Diff: python/pyspark/sql/functions.py ---
@@ -1258,7 +1303,7 @@ def hash(*cols):
'uppercase. Words are delimited by whitespace.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121861352
--- Diff: python/pyspark/sql/functions.py ---
@@ -962,9 +993,9 @@ def add_months(start, months):
"""
Returns the date that is `months` months after `start`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121861095
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
-
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121860425
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
--
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121860217
--- Diff: python/pyspark/sql/functions.py ---
@@ -962,9 +993,9 @@ def add_months(start, months):
"""
Returns the date that is `months` months after `start`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121860061
--- Diff: python/pyspark/sql/functions.py ---
@@ -189,15 +210,15 @@ def _():
}
for _name, _doc in _functions.items():
-globals()[
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121859623
--- Diff: python/pyspark/sql/functions.py ---
@@ -189,15 +210,15 @@ def _():
}
for _name, _doc in _functions.items():
-globals()[
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121857727
--- Diff: python/pyspark/sql/functions.py ---
@@ -92,14 +98,16 @@ def _():
_functions_1_4 = {
# unary math functions
'acos': 'Computes the cosine inverse of the given value; the returned angle is in the range 0.0 through pi.',
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121857687
--- Diff: python/pyspark/sql/functions.py ---
@@ -189,15 +210,15 @@ def _():
}
for _name, _doc in _functions.items():
-globals()[_name
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121543259
--- Diff: python/pyspark/sql/functions.py ---
@@ -962,9 +993,9 @@ def add_months(start, months):
"""
Returns the date that is `months` months after `start`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121537735
--- Diff: python/pyspark/sql/functions.py ---
@@ -92,14 +98,16 @@ def _():
_functions_1_4 = {
# unary math functions
'acos': 'Computes the cosine inverse of the given value; the returned angle is in the range 0.0 through pi.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121543879
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
-
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121539157
--- Diff: python/pyspark/sql/functions.py ---
@@ -189,15 +210,15 @@ def _():
}
for _name, _doc in _functions.items():
-globals()[
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121539617
--- Diff: python/pyspark/sql/functions.py ---
@@ -206,17 +227,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121542536
--- Diff: python/pyspark/sql/functions.py ---
@@ -793,9 +824,9 @@ def date_format(date, format):
.. note:: Use whenever possible specialized functions like `year`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121536038
--- Diff: python/pyspark/sql/functions.py ---
@@ -92,14 +98,16 @@ def _():
_functions_1_4 = {
# unary math functions
'acos': 'Computes the cosine inverse of the given value; the returned angle is in the range 0.0 through pi.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121539034
--- Diff: python/pyspark/sql/functions.py ---
@@ -189,15 +210,15 @@ def _():
}
for _name, _doc in _functions.items():
-globals()[
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121541415
--- Diff: python/pyspark/sql/functions.py ---
@@ -206,17 +227,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121493879
--- Diff: python/pyspark/sql/functions.py ---
@@ -466,10 +487,15 @@ def nanvl(col1, col2):
return Column(sc._jvm.functions.nanvl(_to_java_column(col1), _to_java_column(col2)))
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121491632
--- Diff: python/pyspark/sql/functions.py ---
@@ -109,15 +117,29 @@ def _():
'rint': 'Returns the double value that is closest in value to the argument and is equal to a mathematical integer.',
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121495497
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
--
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121491673
--- Diff: python/pyspark/sql/functions.py ---
@@ -109,15 +117,29 @@ def _():
'rint': 'Returns the double value that is closest in value to the argument and is equal to a mathematical integer.',
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121495462
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
--
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121494100
--- Diff: python/pyspark/sql/functions.py ---
@@ -479,10 +505,15 @@ def rand(seed=None):
return Column(jc)
+@ignore_unicode_prefix
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121483327
--- Diff: python/pyspark/sql/functions.py ---
@@ -67,9 +67,15 @@ def _():
_.__doc__ = 'Window function: ' + doc
return _
+_lit_doc
Github user ueshin commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r121495519
--- Diff: python/pyspark/sql/functions.py ---
@@ -1254,23 +1294,41 @@ def hash(*cols):
# -- String/Binary functions
--
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r116550583
--- Diff: python/pyspark/sql/functions.py ---
@@ -153,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions =
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115889357
--- Diff: python/pyspark/sql/functions.py ---
@@ -153,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions =
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115415748
--- Diff: python/pyspark/sql/functions.py ---
@@ -1120,12 +1159,12 @@ def from_utc_timestamp(timestamp, tz):
@since(1.5)
def to_utc_timestamp(timestamp, tz):
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115415714
--- Diff: python/pyspark/sql/functions.py ---
@@ -1120,12 +1159,12 @@ def from_utc_timestamp(timestamp, tz):
@since(1.5)
def to_utc_timestamp(timestamp, tz):
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115415128
--- Diff: python/pyspark/sql/functions.py ---
@@ -153,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions = {
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115404673
--- Diff: python/pyspark/sql/functions.py ---
@@ -153,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions =
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115385649
--- Diff: python/pyspark/sql/functions.py ---
@@ -1120,12 +1159,12 @@ def from_utc_timestamp(timestamp, tz):
@since(1.5)
def to_utc_timestamp(timestamp, tz):
Github user map222 commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115385050
--- Diff: python/pyspark/sql/functions.py ---
@@ -793,8 +824,8 @@ def date_format(date, format):
.. note:: Use whenever possible specialized functions like `year`.
Github user gatorsmile commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r115134581
--- Diff: python/pyspark/sql/functions.py ---
@@ -409,7 +432,7 @@ def isnan(col):
@since(1.6)
def isnull(col):
-"""An expression t
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114930599
--- Diff: python/pyspark/sql/functions.py ---
@@ -910,8 +941,8 @@ def weekofyear(col):
"""
Extract the week number of a given date as integer.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929597
--- Diff: python/pyspark/sql/functions.py ---
@@ -206,17 +226,20 @@ def _():
@since(1.3)
def approxCountDistinct(col, rsd=None):
"""
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929441
--- Diff: python/pyspark/sql/functions.py ---
@@ -153,7 +173,7 @@ def _():
# math functions that take two arguments as input
_binary_mathfunctions = {
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929075
--- Diff: python/pyspark/sql/functions.py ---
@@ -131,9 +152,8 @@ def _():
'var_pop': 'Aggregate function: returns the population variance of the values in a group.',
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929803
--- Diff: python/pyspark/sql/functions.py ---
@@ -1120,12 +1159,12 @@ def from_utc_timestamp(timestamp, tz):
@since(1.5)
def to_utc_timestamp(timestamp, tz):
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929993
--- Diff: python/pyspark/sql/functions.py ---
@@ -67,9 +67,16 @@ def _():
_.__doc__ = 'Window function: ' + doc
return _
+_li
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929689
--- Diff: python/pyspark/sql/functions.py ---
@@ -456,7 +479,7 @@ def monotonically_increasing_id():
def nanvl(col1, col2):
"""Returns col1
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114930366
--- Diff: python/pyspark/sql/functions.py ---
@@ -793,8 +824,8 @@ def date_format(date, format):
.. note:: Use whenever possible specialized functions like `year`.
Github user HyukjinKwon commented on a diff in the pull request:
https://github.com/apache/spark/pull/17865#discussion_r114929646
--- Diff: python/pyspark/sql/functions.py ---
@@ -397,7 +420,7 @@ def input_file_name():
@since(1.6)
def isnan(col):
-"""An expre
GitHub user map222 opened a pull request:
https://github.com/apache/spark/pull/17865
[SPARK-20456][Docs] Add examples for functions collection for pyspark
## What changes were proposed in this pull request?
This adds documentation to many functions in pyspark.sql.functions.py
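The review threads above mostly check doctest-style usage examples added to docstrings in pyspark/sql/functions.py. The format can be sketched with a plain, hypothetical function (no Spark required); the stdlib `doctest` module then verifies the example output stays accurate:

```python
import doctest

def reverse_words(s):
    """Reverse the order of whitespace-delimited words.

    A hypothetical stand-in showing the ``>>>`` example format the
    reviewers ask for in the pyspark docstrings:

    >>> reverse_words('Spark SQL functions')
    'functions SQL Spark'
    """
    return ' '.join(reversed(s.split()))

# doctest runs every >>> example it finds and reports failures.
print(doctest.testmod().failed)  # 0
```

In pyspark itself the examples run against a `SparkSession` created by the module's doctest harness, which is why reviewers insist the printed DataFrame output matches exactly.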