You would still need SparkR, though.
> On 29 Jun 2016, at 19:14, John Aherne <john.ahe...@justenough.com> wrote:
>
> Microsoft Azure has an option to create a Spark cluster with R Server. MS
> bought Revolution Analytics (the RevoScaleR package) and just recently deployed it.
>
>> On Wed, Jun 29, 2016 at 10:53 AM, Xinh Huynh <xinh.hu...@gmail.com> wrote:
>> There is some new SparkR functionality coming in Spark 2.0, such as
>> "dapply". You could use SparkR to load a Parquet file and then run "dapply"
>> to apply a function to each partition of a DataFrame.
>>
>> Info about loading Parquet file:
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-docs/sparkr.html#from-data-sources
>>
>> API doc for "dapply":
>> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc1-docs/api/R/index.html
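>>
>> A minimal sketch of that flow (Spark 2.0 SparkR API; the file path, the
>> "value" column, and the doubling function are assumptions for illustration):
>>
>>   library(SparkR)
>>   sparkR.session()
>>
>>   # Load the Parquet file into a SparkDataFrame
>>   df <- read.parquet("/path/to/data.parquet")
>>
>>   # dapply runs an R function over each partition; the function receives a
>>   # local data.frame and must return a data.frame matching the declared schema
>>   schema <- structType(structField("value", "double"))
>>   result <- dapply(df, function(part) {
>>     data.frame(value = part$value * 2)
>>   }, schema)
>>
>>   head(collect(result))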
>>
>> Xinh
>>
>>> On Wed, Jun 29, 2016 at 6:54 AM, sujeet jog <sujeet....@gmail.com> wrote:
>>> Try Spark's pipe() on RDDs: you can invoke an R script via pipe and push
>>> the data you want processed to the Rscript's stdin, as sketched below.
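>>>
>>> The driver side would call something like rdd.pipe("Rscript script.R")
>>> (Scala/Python RDD API); pipe() streams each partition's records to the
>>> external process line by line on stdin and reads its stdout back as the
>>> output RDD. A minimal sketch of the R side (the script name and the
>>> numeric doubling are assumptions for illustration):
>>>
>>>   #!/usr/bin/env Rscript
>>>   # Read one record per line from stdin and emit one line per record on
>>>   # stdout, which is the contract pipe() expects
>>>   con <- file("stdin", open = "r")
>>>   while (length(line <- readLines(con, n = 1)) > 0) {
>>>     x <- as.numeric(line)
>>>     cat(x * 2, "\n", sep = "")
>>>   }
>>>   close(con)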
>>>
>>>
>>>> On Wed, Jun 29, 2016 at 7:10 PM, Gilad Landau <gilad.lan...@clicktale.com>
>>>> wrote:
>>>> Hello,
>>>>
>>>> I want to use R code as part of a Spark application (the same way I would
>>>> with Scala/Python). I want to be able to run R code as a map function on a
>>>> big Spark DataFrame loaded from a Parquet file.
>>>>
>>>> Is this even possible, or is the only way to use R through RStudio
>>>> orchestration of our Spark cluster?
>>>>
>>>> Thanks for the help!
>>>>
>>>> Gilad
>>>>
>
>
>
> --
> John Aherne
> Big Data and SQL Developer
>
> Cell: +1 (303) 809-9718
> Email: john.ahe...@justenough.com
> Skype: john.aherne.je
> Web: www.justenough.com