Yeah, I saw that on my cheat sheet. It's marked as "Experimental," which was 
somewhat ominous.

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData


From: Felix Cheung [mailto:felixcheun...@hotmail.com]
Sent: Sunday, September 24, 2017 6:56 PM
To: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>; user@spark.apache.org
Subject: Re: using R with Spark

There are other approaches, like this one.

Search for "Livy" on this page:
https://blog.rstudio.com/2017/01/24/sparklyr-0-5/

It would probably be best to follow up with the sparklyr project for any support questions.
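
Roughly, a Livy connection with sparklyr looks something like the sketch below (untested here; the URL is just a placeholder for your own Livy endpoint):

library(sparklyr)

# Connect through a Livy server instead of a local Spark install.
# "http://livy-host:8998" is a placeholder; point it at your own Livy endpoint.
sc <- spark_connect(master = "http://livy-host:8998", method = "livy")

# ... use sc like any other sparklyr connection ...

spark_disconnect(sc)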

________________________________
From: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>
Sent: Sunday, September 24, 2017 2:42:19 PM
To: user@spark.apache.org
Subject: RE: using R with Spark

>It is free to use; you might need RStudio Server depending on which Spark 
>master you choose.
Yeah, I think that's where my confusion is coming from. I'm looking at a cheat 
sheet. For connecting to a YARN cluster, the first step is:

  1.  Install RStudio Server or RStudio Pro on one of the existing edge nodes.

As a matter of fact, it looks like any setup where you're connecting to a 
cluster requires the paid version of RStudio. All the links I'm turning up on 
Google suggest this. And then there is this:
https://stackoverflow.com/questions/39798798/connect-sparklyr-to-remote-spark-connection

That's about a year old, but I haven't found anything that contradicts it.
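
For reference, my understanding is that the connection itself is just the open-source sparklyr package running in R on the edge node, roughly like the sketch below (the SPARK_HOME path is only illustrative):

library(sparklyr)

# Run in R on an edge node that already has the cluster's Spark client installed.
# The path below is illustrative; use wherever Spark lives on your node.
Sys.setenv(SPARK_HOME = "/usr/lib/spark")

sc <- spark_connect(master = "yarn-client")

# ... dplyr / ML work against the cluster ...

spark_disconnect(sc)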

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData


From: Georg Heiler [mailto:georg.kf.hei...@gmail.com]
Sent: Sunday, September 24, 2017 3:39 PM
To: Felix Cheung <felixcheun...@hotmail.com>; 
Adaryl Wakefield <adaryl.wakefi...@hotmail.com>; 
user@spark.apache.org
Subject: Re: using R with Spark

No. It is free to use; you might need RStudio Server depending on which Spark 
master you choose.
Felix Cheung <felixcheun...@hotmail.com> 
wrote on Sun., Sep. 24, 2017 at 22:24:
Both are free to use; you can use sparklyr from the R shell without RStudio 
(but you probably want an IDE).
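
For example, from a plain R prompt, something along these lines works (the Spark version here is just an example):

install.packages("sparklyr")       # open-source package from CRAN
library(sparklyr)
library(dplyr)

spark_install(version = "2.1.0")   # downloads a local Spark for experimenting
sc <- spark_connect(master = "local")

cars_tbl <- copy_to(sc, mtcars)    # push a local data frame into Spark
head(cars_tbl)

spark_disconnect(sc)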

________________________________
From: Adaryl Wakefield <adaryl.wakefi...@hotmail.com>
Sent: Sunday, September 24, 2017 11:19:24 AM
To: user@spark.apache.org
Subject: using R with Spark

There are two packages, SparkR and sparklyr. sparklyr seems to be the more 
useful of the two. However, do you have to pay to use it? Unless I'm misreading 
this, it seems you have to have the paid version of RStudio to use it.
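
From the docs, my rough (untested) understanding of the difference in flavor between the two is something like this:

# SparkR: ships with Spark itself, base-R-style API
library(SparkR)
sparkR.session()
df <- as.DataFrame(mtcars)
head(summarize(groupBy(df, "cyl"), avg_mpg = avg(df$mpg)))
sparkR.session.stop()

# sparklyr: CRAN package, dplyr-style API
library(sparklyr)
library(dplyr)
sc <- spark_connect(master = "local")
cars_tbl <- copy_to(sc, mtcars)
cars_tbl %>% group_by(cyl) %>% summarise(avg_mpg = mean(mpg))
spark_disconnect(sc)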

Adaryl "Bob" Wakefield, MBA
Principal
Mass Street Analytics, LLC
913.938.6685
www.massstreet.net
www.linkedin.com/in/bobwakefieldmba
Twitter: @BobLovesData

