MichaelChirico commented on a change in pull request #28350:
URL: https://github.com/apache/spark/pull/28350#discussion_r415364659



##########
File path: R/pkg/R/DataFrame.R
##########
@@ -1669,29 +1667,43 @@ setMethod("dapplyCollect",
 #' @aliases gapply,SparkDataFrame-method
 #' @rdname gapply
 #' @name gapply
+#' @details
+#' \code{func} is a function of two arguments. The first, usually named \code{key}
+#' (though this is not enforced), corresponds to the grouping key and will be a
+#' \code{list} of \code{length(cols)} length-one objects holding the grouping
+#' columns' values for the current group.
+#'
+#' The second, herein \code{x}, will be a local \code{\link{data.frame}} with the
+#' columns of the input not in \code{cols}, for the rows corresponding to \code{key}.
+#'
+#' The output of \code{func} must be a \code{data.frame} matching \code{schema} --
+#' in particular this means the names of the output \code{data.frame} are irrelevant.
+#'
 #' @seealso \link{gapplyCollect}
 #' @examples
 #'
 #' \dontrun{
-#' Computes the arithmetic mean of the second column by grouping
-#' on the first and third columns. Output the grouping values and the average.
+#' # Computes the arithmetic mean of the second column by grouping

Review comment:
       Even though this code in `\dontrun{}` won't be evaluated by CRAN machines, end users might still want to copy-paste the code directly and expect it to be valid R code (this is the point of the `run.dontrun` argument for `example()`).

   So I added the comment hash here, in addition to the change describing `func` detailed above.
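   To make the documented contract concrete, here is a hedged, Spark-free sketch of what `func` receives and must return. The names `key` and `x` follow the `@details` text above; the emulation of grouping via base R's `split` is purely illustrative and is not how SparkR dispatches groups.

   ```r
   # Illustrative only: emulate gapply's calling convention locally.
   # func receives `key` (a list of length-one grouping values) and `x`
   # (a data.frame of the remaining columns for that group), and must
   # return a data.frame matching the declared schema.
   func <- function(key, x) {
     data.frame(group = key[[1]], avg = mean(x$value))
   }

   df <- data.frame(group = c("a", "a", "b"), value = c(1, 3, 5))

   # Local stand-in for gapply: split rows by the grouping column and
   # call func once per group with (key, x) as described above.
   out <- do.call(rbind, lapply(split(df, df$group), function(g) {
     func(as.list(unique(g["group"])), g[, "value", drop = FALSE])
   }))
   ```

   Note that, per the `@details` text, only the positions and types of `out`'s columns need to match the schema; the column names themselves are irrelevant.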




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


