Repository: spark
Updated Branches:
  refs/heads/master f9ff75653 -> 88c826272
[SPARK-26010][R] fix vignette eval with Java 11

## What changes were proposed in this pull request?

Changes in the vignette only, to disable eval.

## How was this patch tested?

Jenkins

Author: Felix Cheung <felixcheun...@hotmail.com>

Closes #23007 from felixcheung/rjavavervig.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/88c82627
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/88c82627
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/88c82627

Branch: refs/heads/master
Commit: 88c82627267a9731b2438f0cc28dd656eb3dc834
Parents: f9ff756
Author: Felix Cheung <felixcheun...@hotmail.com>
Authored: Mon Nov 12 19:03:30 2018 -0800
Committer: Felix Cheung <felixche...@apache.org>
Committed: Mon Nov 12 19:03:30 2018 -0800

----------------------------------------------------------------------
 R/pkg/vignettes/sparkr-vignettes.Rmd | 14 ++++++++++++++
 1 file changed, 14 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/88c82627/R/pkg/vignettes/sparkr-vignettes.Rmd
----------------------------------------------------------------------
diff --git a/R/pkg/vignettes/sparkr-vignettes.Rmd b/R/pkg/vignettes/sparkr-vignettes.Rmd
index 7d924ef..f80b45b 100644
--- a/R/pkg/vignettes/sparkr-vignettes.Rmd
+++ b/R/pkg/vignettes/sparkr-vignettes.Rmd
@@ -57,6 +57,20 @@ First, let's load and attach the package.
 library(SparkR)
 ```
 
+```{r, include=FALSE}
+# disable eval if java version not supported
+override_eval <- tryCatch(!is.numeric(SparkR:::checkJavaVersion()),
+                          error = function(e) { TRUE },
+                          warning = function(e) { TRUE })
+
+if (override_eval) {
+  opts_hooks$set(eval = function(options) {
+    options$eval = FALSE
+    options
+  })
+}
+```
+
 `SparkSession` is the entry point into SparkR which connects your R program to a Spark cluster.
You can create a `SparkSession` using `sparkR.session` and pass in options such as the application name, any Spark packages depended on, etc. We use default settings in which it runs in local mode. It auto downloads Spark package in the background if no previous installation is found. For more details about setup, see [Spark Session](#SetupSparkSession).
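For context, the added chunk relies on knitr's option-hook mechanism: a hook registered for an option runs before each chunk and may rewrite that chunk's options. The pattern can be exercised standalone; the sketch below stubs out `SparkR:::checkJavaVersion()` (which returns the numeric Java version, or signals an error/warning on an unsupported JVM) with a hypothetical failing helper, so every chunk would become display-only.

```r
# Stand-in for SparkR:::checkJavaVersion(); here it always fails,
# simulating an unsupported Java installation (hypothetical stub).
checkJavaVersion <- function() stop("unsupported Java version")

# Any error, warning, or non-numeric result disables chunk evaluation.
override_eval <- tryCatch(!is.numeric(checkJavaVersion()),
                          error = function(e) TRUE,
                          warning = function(e) TRUE)

if (override_eval && requireNamespace("knitr", quietly = TRUE)) {
  # The hook runs once per chunk and returns the (modified) option list;
  # forcing eval = FALSE renders code without executing it.
  knitr::opts_hooks$set(eval = function(options) {
    options$eval <- FALSE
    options
  })
}
```

Using an option hook rather than `opts_chunk$set(eval = FALSE)` means the override also wins over chunks that set `eval` locally, which is why the vignette takes this route.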