[ https://issues.apache.org/jira/browse/SPARK-8684?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14614288#comment-14614288 ]

Vincent Warmerdam commented on SPARK-8684:
------------------------------------------

Mhm. I am noticing that whenever I load the `sparkR` script, it reminds me 
that Spark was built for R 3.1. 

Even if we want R 3.2 for ggplot, we may also need to ensure R 3.2 for other 
packages. That implies that all the slaves also need R 3.2 and that SparkR 
needs to support this. Has SparkR been tested for that? If that's not the 
case, we may just want to delay this task until SparkR is ready for 3.2. 
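For illustration, here is a minimal sketch (hypothetical, not from this 
thread) of the kind of version guard involved: base R can report and compare 
its own version, so a script depending on 3.2-only packages could fail fast 
on an older node instead of breaking later with an obscure package error.

```r
# Hypothetical sketch: fail fast if this node's R is older than 3.2.
# getRversion() returns a numeric_version, which compares cleanly
# against a version string.
if (getRversion() < "3.2.0") {
  stop("R >= 3.2 is required, but this node is running ", R.version.string)
}
```

Running a check like this on every slave as well as the driver would surface 
exactly the version skew being discussed here, e.g. an upgraded driver 
talking to slaves still on the AMI's R 3.1.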

> Update R version in Spark EC2 AMI
> ---------------------------------
>
>                 Key: SPARK-8684
>                 URL: https://issues.apache.org/jira/browse/SPARK-8684
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2, SparkR
>            Reporter: Shivaram Venkataraman
>            Priority: Minor
>
> Right now the R version in the AMI is 3.1 -- however, a number of R libraries 
> need R version 3.2, and it would be good to update the R version on the AMI 
> while launching an EC2 cluster.


