Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12426#discussion_r60977738

--- Diff: R/pkg/R/context.R ---
@@ -226,6 +226,47 @@ setCheckpointDir <- function(sc, dirName) {
   invisible(callJMethod(sc, "setCheckpointDir", suppressWarnings(normalizePath(dirName))))
 }
 
+#' @title Run a function over a list of elements, distributing the computations with Spark.
+#'
+#' @description
+#' Applies a function, in a manner similar to doParallel or lapply, to the elements of a list.
+#' The computations are distributed using Spark. It is conceptually equivalent to the following code:
+#' unlist(lapply(list, func))
+#'
+#' Known limitations:
+#' - variable scoping and capture: compared to R's rich support for variable resolution, the
+#'   distributed nature of SparkR limits how variables are resolved at runtime. All the variables
+#'   that are available through lexical scoping are embedded in the closure of the function and
+#'   available as read-only variables within the function. Environment variables should be
+#'   stored in temporary variables outside the function, not accessed directly within it.
+#'
+#' - loading external packages: in order to use a package, you need to load it inside the
+#'   closure. For example, if you rely on the MASS package, here is how you would use it:
+#'
+#'\dontrun{
+#' train <- function(hyperparam) {
+#'   library(MASS)
+#'   model <- lm.ridge(y ~ x + z, data, lambda = hyperparam)
+#'   model
+#' }
+#'}
+#'
+#' @param list the list of elements
+#' @param func a function that takes one argument.
+#' @examples
+#' Here is a trivial example that doubles the values in a list:
+#'\dontrun{
+#' doubled <- sparkLapply(1:10, function(x) { 2 * x })

--- End diff --

Here, too.
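As a side note, the documented equivalence to `unlist(lapply(list, func))` and the variable-capture advice can be sketched in plain base R, without Spark; this is only an illustration of the stated semantics, not a call to `sparkLapply` itself:

```r
# Sketch of the documented semantics using base R only (no Spark required).
# The docs state that sparkLapply(list, func) is conceptually
# unlist(lapply(list, func)).
Sys.setenv(SCALE = "2")

# Per the limitations note: read environment variables into a local
# variable *outside* the closure, rather than inside the function body.
scale_num <- as.numeric(Sys.getenv("SCALE"))

# The closure captures scale_num via lexical scoping (read-only in SparkR).
doubled <- unlist(lapply(1:10, function(x) scale_num * x))
print(doubled)
```

With Spark available, replacing `unlist(lapply(...))` with the proposed `sparkLapply(...)` would distribute the same computation across the cluster.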