GitHub user felixcheung opened a pull request: https://github.com/apache/spark/pull/10652
[SPARK-12699][SPARKR] R driver process should start in a clean state

Currently the R worker process is launched with the --vanilla option, which brings it up in a clean state (no init profile or workspace data; see https://stat.ethz.ch/R-manual/R-devel/library/base/html/Startup.html). However, the R process for the Spark driver is not. We should do the same there, because:

1. It would make the driver consistent with the R worker process - for instance, a library could not end up loaded in the driver but not in the worker.
2. Since SparkR depends on .libPaths() and .First(), it could otherwise be broken by something in the user's workspace.

Here are the changes proposed:

1. When starting the `sparkR` shell (exception: still allow saving/restoring the workspace, since the driver/shell is local)
2. When launching the R driver in cluster mode
3. In cluster mode, when calling R to install the shipped R package

This was discussed in PR #10171. @shivaram @sun-rui

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/felixcheung/spark rvanilla

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10652.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #10652

----

commit c3488c9eda1f731c24769f20eb570d97e4aa5939
Author: felixcheung <felixcheun...@hotmail.com>
Date: 2016-01-07T09:13:54Z

    add R command line options

commit 24fee57e42beec3315979b8db4d817474bcd4baa
Author: felixcheung <felixcheun...@hotmail.com>
Date: 2016-01-07T22:40:50Z

    allow save/restore user workspace when running shell
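To illustrate the idea behind the proposed changes (this is a hypothetical sketch, not Spark's actual launcher code): in cluster mode the R driver command line gets the full `--vanilla` treatment, while the interactive `sparkR` shell suppresses only the profile/site startup files so the user can still save and restore a local workspace. The `build_r_command` helper below and its flag choices are illustrative assumptions.

```python
def build_r_command(script, interactive=False):
    """Build an R command line that starts in a clean state.

    Hypothetical helper mirroring the PR's approach:
    - cluster mode: --vanilla (no profiles, no site files,
      no workspace save/restore)
    - interactive shell: skip profiles/site files only, keeping
      workspace save/restore since the shell is local
    """
    if interactive:
        flags = ["--no-site-file", "--no-init-file"]
    else:
        flags = ["--vanilla"]
    return ["Rscript"] + flags + [script]

print(build_r_command("driver.R"))
# → ['Rscript', '--vanilla', 'driver.R']
print(build_r_command("shell.R", interactive=True))
# → ['Rscript', '--no-site-file', '--no-init-file', 'shell.R']
```

The flags used here (`--vanilla`, `--no-site-file`, `--no-init-file`) are real R startup options documented on the Startup help page linked above; which subset the PR forwards in each mode is what the review discussion is about.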