This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new d371180  [MINOR][R] Deduplicate RStudio setup documentation
d371180 is described below

commit d371180c01bf68ed4e5f88df836c7f2fb27a46d3
Author: Hyukjin Kwon <gurwls...@apache.org>
AuthorDate: Wed Jan 2 08:04:36 2019 +0800

    [MINOR][R] Deduplicate RStudio setup documentation
    
    ## What changes were proposed in this pull request?
    
    This PR targets to deduplicate RStudio setup for SparkR.
    
    ## How was this patch tested?
    
    N/A
    
    Closes #23421 from HyukjinKwon/minor-doc.
    
    Authored-by: Hyukjin Kwon <gurwls...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 R/README.md | 10 +---------
 1 file changed, 1 insertion(+), 9 deletions(-)

diff --git a/R/README.md b/R/README.md
index d77a1ec..e238a0e 100644
--- a/R/README.md
+++ b/R/README.md
@@ -39,15 +39,7 @@ To set other options like driver memory, executor memory etc. you can pass in th
 
 #### Using SparkR from RStudio
 
-If you wish to use SparkR from RStudio or other R frontends you will need to set some environment variables which point SparkR to your Spark installation. For example
-```R
-# Set this to where Spark is installed
-Sys.setenv(SPARK_HOME="/Users/username/spark")
-# This line loads SparkR from the installed directory
-.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
-library(SparkR)
-sparkR.session()
-```
+If you wish to use SparkR from RStudio, please refer [SparkR documentation](https://spark.apache.org/docs/latest/sparkr.html#starting-up-from-rstudio).
 
 #### Making changes to SparkR
 


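For reference, the manual setup that the removed snippet described, and that the linked SparkR documentation now covers, looks roughly like the sketch below. The SPARK_HOME path is a placeholder; point it at your own Spark installation.

```R
# Placeholder path: set this to wherever Spark is installed locally
Sys.setenv(SPARK_HOME = "/Users/username/spark")

# Make the SparkR package bundled with that installation visible to R
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))

# Load SparkR and start a local session
library(SparkR)
sparkR.session()
```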
