Github user phegstrom commented on a diff in the pull request:
https://github.com/apache/spark/pull/22227#discussion_r214379104
--- Diff: R/pkg/R/functions.R ---
@@ -3410,13 +3410,14 @@ setMethod("collect_set",
#' \dontrun{
#' head(select(df, split_string(df$Sex, "a")))
#' head(select(df, split_string(df$Class, "\\d")))
+#' head(select(df, split_string(df$Class, "\\d", 2)))
#' # This is equivalent to the following SQL expression
#' head(selectExpr(df, "split(Class, '\\\\d')"))}
#' @note split_string 2.3.0
setMethod("split_string",
signature(x = "Column", pattern = "character"),
-          function(x, pattern) {
-            jc <- callJStatic("org.apache.spark.sql.functions", "split", x@jc, pattern)
+          function(x, pattern, limit = -1) {
+            jc <- callJStatic("org.apache.spark.sql.functions", "split", x@jc, pattern, limit)
--- End diff ---
@felixcheung what's the best way to run a single unit test group with
testthat in this repo? The Spark docs only point to ./run-tests.sh, which runs
all of the R unit tests.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]