I've been using JRI to communicate with R from Spark, with some utils to
convert Scala data types into R data types/data frames etc.
http://www.rforge.net/JRI/
I've been using mapPartitions to push R closures through JRI and collecting
the results back in Spark. This works reasonably well, though nowhere near
as nicely as straight Spark -- as expected.
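For anyone curious, the pattern looks roughly like the sketch below. This is a minimal illustration, not my actual utils: the names are made up, and it assumes JRI is on the classpath with R available on each worker. Note that JRI only supports one Rengine per JVM, so the engine has to be created (or reused) inside the partition closure on the worker, never serialized from the driver.

```scala
import org.apache.spark.SparkContext
import org.rosuda.JRI.Rengine

// Hypothetical sketch of pushing an R expression through JRI per partition.
object RPartitionSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[2]", "jri-sketch")
    val data = sc.parallelize(1 to 100).map(_.toDouble)

    val doubled = data.mapPartitions { iter =>
      // JRI engines are not serializable and R allows only one engine
      // per JVM, so reuse the existing one if a previous task made it.
      val engine = Option(Rengine.getMainEngine)
        .getOrElse(new Rengine(Array("--no-save"), false, null))

      val values = iter.toArray
      engine.assign("x", values)         // Scala Array[Double] -> R numeric vector
      val result = engine.eval("x * 2")  // run the R expression on this partition
      result.asDoubleArray().iterator    // convert back for Spark to collect
    }

    println(doubled.collect().mkString(", "))
    sc.stop()
  }
}
```

The same shape works for arbitrary R closures: assign the partition's data into the R session, eval the R code, and convert the REXP result back into Scala types.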

I've also been using JavaGD, which lets me use ggplot to visualize data
going from Spark -> R -- IMO much nicer than anything Java/Scala can
provide.


It's interesting to hear of the R interface work at AMPLab. Would anyone
there care to elaborate on what will be available, the limitations, and
possibly the timeframe?


tks
shay



On Wed, Jan 1, 2014 at 8:55 PM, guxiaobo1982 <[email protected]> wrote:

> I read the good news from here:
>
> http://blog.revolutionanalytics.com/2013/12/apache-spark.html
>
>
>
> > Currently, Spark supports programming interfaces for Scala, Java
> > <http://spark.incubator.apache.org/docs/latest/java-programming-guide.html>
> > and Python
> > <http://spark.incubator.apache.org/docs/latest/python-programming-guide.html>.
> > For R users, there is good news: an R interface is in the works and
> > under development by the team at AMPLab; our sources tell us this is
> > expected to be released in the first half of 2014.
>
>
> Regards,
>
> Xiaobo Gu
>
