Github user mengxr commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14761#discussion_r75797308
  
    --- Diff: R/pkg/R/utils.R ---
    @@ -697,3 +697,20 @@ is_master_local <- function(master) {
     is_sparkR_shell <- function() {
       grepl(".*shell\\.R$", Sys.getenv("R_PROFILE_USER"), perl = TRUE)
     }
    +
    +instructionForInstall <- function(mode) {
    +  if (mode == "remote") {
    +    paste0("Connecting to a remote Spark master. ",
    +           "Please make sure Spark package is also installed in this 
machine.\n",
    +           "- If there is one, set the path in sparkHome parameter or ",
    +           "environment variable SPARK_HOME.\n",
    +           "- If not, you may run install.spark function to do the job. ",
    +           "Please make sure the Spark and the Hadoop versions ",
    +           "match the versions on the cluster. ",
    +           "Currently only Spark 2.0 is supported. ",
    --- End diff ---
    
    The SparkR version should be the same as the Spark version. So instead of
    saying `only Spark 2.0`, we can say that the current `SparkR package is
    compatible with Spark x.x.x`.
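
    For example, a minimal sketch of that wording (assuming the installed
    SparkR package version tracks the Spark version; `sparkVer` is just an
    illustrative name):

        # Look up the installed SparkR package version (utils::packageVersion)
        sparkVer <- packageVersion("SparkR")
        # Build the suggested message with the actual version substituted
        paste0("The current SparkR package is compatible with Spark ",
               sparkVer, ".")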

