GitHub user yanboliang opened a pull request:
https://github.com/apache/spark/pull/15131
[SPARK-17577][SparkR] SparkR support for adding files to a Spark job and
retrieving them on executors
## What changes were proposed in this pull request?
Scala and Python users can add files to a Spark job via the submit option
```--files``` or by calling ```SparkContext.addFile()```, and can then
retrieve an added file with ```SparkFiles.get(filename)```.
We should support the same capability for SparkR users, since they may need
to install third-party R packages on each executor. For example, a SparkR
user can first download third-party R package sources to the driver, add
those files to the Spark job with this API, and then have each executor
install the packages via ```install.packages```.
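For illustration, here is a minimal sketch of how that workflow could look
once the API lands. The function names ```spark.addFile()``` and
```spark.getSparkFiles()``` are assumed to mirror the Scala/Python API, and
the package name and URL are placeholders, not part of this patch:
```r
library(SparkR)
sparkR.session()

# On the driver: download a third-party package source tarball first.
# (Package name and URL are illustrative.)
tarball <- file.path(tempdir(), "mypkg_1.0.tar.gz")
download.file("https://example.com/mypkg_1.0.tar.gz", destfile = tarball)

# Ship the file to every node participating in the job.
spark.addFile(tarball)

# On each executor: resolve the shipped file and install the package locally.
installed <- spark.lapply(1:4, function(i) {
  pkg <- spark.getSparkFiles("mypkg_1.0.tar.gz")
  install.packages(pkg, repos = NULL, type = "source")
  requireNamespace("mypkg", quietly = TRUE)
})
```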
## How was this patch tested?
Added unit tests.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/yanboliang/spark spark-17577
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/15131.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #15131
----
commit d3dd3808e88b3f4ba5af683eb7d7709fcc2710f7
Author: Yanbo Liang <[email protected]>
Date: 2016-09-17T15:48:07Z
Fix typos
----