Github user felixcheung commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11336#discussion_r83927478
  
    --- Diff: R/pkg/R/DataFrame.R ---
    @@ -1166,26 +1166,33 @@ setMethod("take",
                 collect(limited)
               })
     
    -#' Head
    -#'
    -#' Return the first \code{num} rows of a SparkDataFrame as a R data.frame. If \code{num} is not
    -#' specified, then head() returns the first 6 rows as with R data.frame.
    -#'
    -#' @param x a SparkDataFrame.
    +#' Return the first part of a SparkDataFrame or Column
    +#' 
    +#' If \code{x} is a SparkDataFrame, its first rows will be returned as a data.frame.
    +#' If the dataset is a \code{Column}, its first elements will be returned as a vector.
    +#' The number of elements to be returned is given by parameter \code{num}.
    +#' Default value for \code{num} is 6.
    +#' @param x a SparkDataFrame or Column
     #' @param num the number of rows to return. Default is 6.
     #' @return A data.frame.
     #'
     #' @family SparkDataFrame functions
     #' @aliases head,SparkDataFrame-method
    -#' @rdname head
     #' @name head
     #' @export
     #' @examples
     #'\dontrun{
    +#' # Initialize Spark context and SQL context
     #' sparkR.session()
    -#' path <- "path/to/file.json"
    -#' df <- read.json(path)
    -#' head(df)
    +#' 
    +#' # Create a DataFrame from the Iris dataset
    +#' irisDF <- as.DataFrame(iris)
    --- End diff --
    
    you might want to avoid using `iris` in the example, because it will cause a 
    warning about the column names containing `.`, and if at some point we fix this 
    to support `.` in column names, the example will need to be updated
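
    As a sketch of this suggestion (not part of the PR under review), the roxygen example could instead use a built-in dataset whose column names contain no dots, e.g. `faithful` (columns `eruptions` and `waiting`), so `as.DataFrame()` emits no column-name warning:

    ```r
    # Hypothetical alternative for the roxygen example, assuming a SparkR
    # session is available; `faithful` has dot-free column names, so this
    # example would survive a future change that adds support for `.` in
    # column names.
    sparkR.session()

    # Create a SparkDataFrame from a base R dataset
    faithfulDF <- as.DataFrame(faithful)

    # Return the first 6 rows (the default for num)
    head(faithfulDF)

    # Return the first 3 rows
    head(faithfulDF, num = 3)
    ```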


