Just a heads up: I opened an issue against Spark about exposing some of the
SparkR APIs. They are too restrictive in what the SparkR library allows
(no custom Java methods), and the R backend is package-protected.

https://issues.apache.org/jira/browse/SPARK-13573

Our SparkR implementation is a fork of that package that exposes those
APIs. This allows us to use an existing SparkContext, retrieve execution
requests from the kernel (the same way we and Zeppelin do it in PySpark),
and interact more easily with the remote Java process containing the
SparkContext.

We've been needing to open this issue for a while, so I figured I'd do it
now.
