[ 
https://issues.apache.org/jira/browse/SPARK-6816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14588271#comment-14588271
 ] 

Rick Moritz commented on SPARK-6816:
------------------------------------

Apparently this workaround is no longer needed for Spark 1.4.0, which invokes 
a shell script instead of going directly to java as sparkR-pkg did, and fetches 
the required environment parameters.
With spark-defaults.conf being respected, and SPARK_MEM available for memory 
options, there probably isn't much left that needs to be passed via -D to the 
shell script.
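
For illustration, a minimal sketch of what that looks like from R, assuming the 
SparkR API shipped with 1.4.x (sparkR.init accepting a sparkEnvir list; the 
property values below are just examples):

    # Pass Spark properties directly to sparkR.init instead of -D flags
    library(SparkR)
    sc <- sparkR.init(
      master     = "local[2]",
      appName    = "ConfExample",
      sparkEnvir = list(spark.executor.memory = "2g")
    )

    # Equivalently, since bin/sparkR now goes through the regular launcher,
    # the same properties can live in conf/spark-defaults.conf, e.g.:
    #   spark.executor.memory   2g
    #   spark.driver.memory     1g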

> Add SparkConf API to configure SparkR
> -------------------------------------
>
>                 Key: SPARK-6816
>                 URL: https://issues.apache.org/jira/browse/SPARK-6816
>             Project: Spark
>          Issue Type: New Feature
>          Components: SparkR
>            Reporter: Shivaram Venkataraman
>            Priority: Minor
>
> Right now the only way to configure SparkR is to pass in arguments to 
> sparkR.init. The goal is to add an API similar to SparkConf on Scala/Python 
> to make configuration easier.
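
The issue above asks for an API similar to SparkConf on Scala/Python. Purely as a 
hypothetical sketch of what such a builder could look like in SparkR (none of these 
functions exist; the names are made up to mirror the Scala/Python pattern):

    # Hypothetical SparkConf-style builder for SparkR (illustration only)
    conf <- sparkRConf()                                  # hypothetical constructor
    conf <- setAppName(conf, "ConfExample")               # hypothetical setter
    conf <- setMaster(conf, "local[2]")                   # hypothetical setter
    conf <- setConf(conf, "spark.executor.memory", "2g")  # hypothetical setter
    sc   <- sparkR.init(conf = conf)                      # hypothetical overload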


