Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/14761#discussion_r75921626
--- Diff: R/pkg/R/sparkR.R ---
@@ -550,3 +532,27 @@ processSparkPackages <- function(packages) {
}
splittedPackages
}
+
+# Utility function that checks and install Spark to local folder if not found
+#
+# Installation will not be triggered if it's called from sparkR shell
+# or if the master url is not local
+sparkLocalInstall <- function(sparkHome, master) {
+  if (!isSparkRShell()) {
+    if (!is.na(file.info(sparkHome)$isdir)) {
+      msg <- paste0("Spark package found in SPARK_HOME: ", sparkHome)
+      message(msg)
+    } else {
+      if (!nzchar(master) || isMasterLocal(master)) {
+        msg <- paste0("Spark not found in SPARK_HOME: ", sparkHome)
+        message(msg)
+        packageLocalDir <- install.spark()
+      } else {
+        msg <- paste0("Spark not found in SPARK_HOME: ",
+                      sparkHome, "\n", installInstruction("remote"))
+        stop(msg)
+      }
+    }
+  }
--- End diff ---
nit: perhaps be more specific about what's being returned?
it could also help to have @param / @return in the comment block above this
function - no need for `#'` since we are not publishing it.
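
For illustration, the documented version might look something like the sketch below. This is only a sketch of the suggestion, not code from the PR: the @param / @return wording and the explicit `sparkHome` return at the end are assumptions added here to show one way of making the return value specific.

```r
# Utility function that checks and installs Spark to a local folder if not found
#
# Installation will not be triggered if it is called from the sparkR shell
# or if the master url is not local
#
# @param sparkHome the directory expected to contain a Spark package
# @param master the Spark master URL, used to tell local mode from remote mode
# @return the resolved Spark home: sparkHome itself if a package is already
#         there, otherwise the directory that install.spark() installed to
sparkLocalInstall <- function(sparkHome, master) {
  if (!isSparkRShell()) {
    if (!is.na(file.info(sparkHome)$isdir)) {
      msg <- paste0("Spark package found in SPARK_HOME: ", sparkHome)
      message(msg)
    } else {
      if (!nzchar(master) || isMasterLocal(master)) {
        msg <- paste0("Spark not found in SPARK_HOME: ", sparkHome)
        message(msg)
        # keep the freshly installed location so it becomes the return value
        sparkHome <- install.spark()
      } else {
        msg <- paste0("Spark not found in SPARK_HOME: ",
                      sparkHome, "\n", installInstruction("remote"))
        stop(msg)
      }
    }
  }
  # explicit return value, so callers know which directory to use
  sparkHome
}
```

With an explicit last expression like that, a caller could then do something like `sparkHome <- sparkLocalInstall(sparkHome, master)` and the return contract is clear from the @return tag.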