Github user felixcheung commented on the issue:

    https://github.com/apache/spark/pull/17941
  
    @cloud-fan thank you for chiming in. I have been looking for some feedback 
and I'm hoping we could get something more definitive.
    
    I'm sure there is a need to share context between different languages. Having been on the consuming side for a long time, I'm surprised by the lack of incentive to formalize how to share context, and to avoid private methods getting called, the wire protocol being reverse-engineered, and so on.
    
    I'm aware of at least 5 or 6 distinct `R-to-Spark` implementations. Among these, I'm pretty certain only `databricks` has proper support for multiple SparkSessions. There is only one other implementation that has sharing, and it does so at the SparkSession level (i.e. there is only one session across all languages).
    
    So I think sharing context is an orthogonal question, and it can be addressed in several different ways.
    
    Even if there isn't multiple/default/active session support, or it wouldn't be applicable, I still think it makes sense to have the concept of a global session; if nothing else, we would get API parity. So all I'm saying is we should have that in SparkR, so that any other APIs that work only with the global session would actually work with SparkR by itself.
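    For context, a minimal sketch of what the "global session" concept looks like on the JVM side (Scala). This is only an illustration, not the proposal itself: the `appName`/`master` values are made up, and the lookups shown are the public accessors on the `SparkSession` companion object in recent Spark versions. The parity argument, as I understand it, is that a session started from SparkR should be discoverable through the same kind of lookups.

    ```scala
    import org.apache.spark.sql.SparkSession

    // The standard Scala entry point registers the session it creates as the
    // default (and active) session on the SparkSession companion object.
    val spark = SparkSession.builder()
      .appName("example")     // hypothetical value, for illustration only
      .master("local[*]")     // hypothetical value, for illustration only
      .getOrCreate()

    // Code that only knows about "the" global session finds it like this:
    val active: Option[SparkSession] = SparkSession.getActiveSession
    val default: Option[SparkSession] = SparkSession.getDefaultSession
    ```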
    
    And for that, a proposal was made, and from what I can see the implementation isn't going to be hard.
    
    What do people think?


