On Mon, Jul 1, 2019 at 9:08 PM Eric Charles <e...@apache.org> wrote:
>
> Hi,
>
> I have lately commented on the PySpark and SparkR interpreters Removal Pull 
> Request.
>
> https://github.com/apache/incubator-toree/pull/166#issuecomment-500356895
>
> If it is too late to bring them back in the picture: I wonder
>
> - What is your advice for benefiting from a shared Spark context between 
> Scala/Python/R?
> - What difficulties were encountered with that integration? I wonder how 
> feasible it would be for me to maintain it in a fork.
>
> Thx, Eric
>

I would say that it mostly started as a prototype, and issues being
reported were not being addressed by the community.

On the other hand, there were other feature-rich kernel implementations,
such as IPython and IRKernel, that provided the necessary integration
with Spark.

As for shared context support, see the comments from @Gino Bustelo in
this thread:
https://www.mail-archive.com/dev@toree.incubator.apache.org/msg01873.html


-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
