Github user adrian555 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22455#discussion_r219938566

    --- Diff: R/pkg/R/DataFrame.R ---
    @@ -244,11 +246,25 @@ setMethod("showDF",
     #' @note show(SparkDataFrame) since 1.4.0
     setMethod("show", "SparkDataFrame",
               function(object) {
    -            cols <- lapply(dtypes(object), function(l) {
    -              paste(l, collapse = ":")
    -            })
    -            s <- paste(cols, collapse = ", ")
    -            cat(paste(class(object), "[", s, "]\n", sep = ""))
    +            if (identical(sparkR.conf("spark.sql.repl.eagerEval.enabled", "false")[[1]], "true")) {
    --- End diff --

    Retrieving all confs through `sparkR.conf()` does not provide a default value for each conf, so an extra condition check is needed. Ideally, I believe we should cache all of the session's configuration inside the SparkR environment to avoid backend calls entirely. Anyway, I made the suggested change.
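
    To illustrate the distinction (a sketch only, assuming an active SparkR session started with `sparkR.session()`):

    ```r
    # With a key and a default, sparkR.conf() returns the default when
    # the conf is unset, so the result can be compared directly:
    eager <- sparkR.conf("spark.sql.repl.eagerEval.enabled", "false")[[1]]
    identical(eager, "true")

    # Called with no arguments, sparkR.conf() returns only the confs that
    # are actually set, as a named list; unset keys are simply absent,
    # so an extra membership check is needed before indexing:
    confs <- sparkR.conf()
    eager <- if ("spark.sql.repl.eagerEval.enabled" %in% names(confs)) {
      confs[["spark.sql.repl.eagerEval.enabled"]]
    } else {
      "false"
    }
    ```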