GitHub user msannell opened a pull request:

    https://github.com/apache/spark/pull/6557

    [SPARK-8019] [SPARKR] Support SparkR spawning worker R processes with a 
command other than Rscript

    This is a simple change to add a new configuration property
    "spark.sparkr.r.command" that specifies the command SparkR uses
    when creating an R worker process.  If it is not specified,
    "Rscript" is used by default.
    
    I did not add any documentation, since I could not find any place
    where configuration properties such as "spark.sparkr.use.daemon"
    are documented.
    
    I also did not add a unit test.  The only test that would work in
    all environments is one that starts SparkR with
    sparkR.init(sparkEnvir=list(spark.sparkr.r.command="Rscript")),
    which just exercises the default value.  I consider this a
    low-risk change.
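    As a sketch, a deployment with a custom R installation could point
    workers at an alternate front end; the path below is illustrative,
    not part of this change:

        library(SparkR)
        # Hypothetical: run SparkR workers with a non-default Rscript binary.
        # Omitting spark.sparkr.r.command falls back to plain "Rscript".
        sc <- sparkR.init(
          sparkEnvir = list(spark.sparkr.r.command = "/opt/R-3.2/bin/Rscript")
        )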
    
    Likely committers: @shivaram 

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/msannell/spark altR

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/6557.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #6557
    
----
commit 7eac1427392b1039823e80a281cc6dcb7ad160db
Author: Michael Sannella x268 <[email protected]>
Date:   2015-06-01T19:14:33Z

    add spark.sparkr.r.command config parameter

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at [email protected] or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
