[
https://issues.apache.org/jira/browse/SPARK-16581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15429858#comment-15429858
]
Shivaram Venkataraman commented on SPARK-16581:
-----------------------------------------------
These are good points -- the way I see it, it's a question of the various
levels at which we present APIs to package developers. You are right that the
contract we have with the R->JVM API is that functions run on the JVM running
Spark. If we are going to enable multiple JVMs (or, say, multiple Spark
contexts) then we will have to add some way to resolve which JVM to run a
function on. That might change the implementation but should not affect the
API significantly, I'd say (it would be something like passing a JVM id in the
API).
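To make this concrete, here is a rough sketch of how a JVM id could be
threaded through the existing call path. callJMethod, newJObject and
invokeJava are the internal SparkR helpers as they exist today; the jvmId
argument and currentJVM() are hypothetical illustrations, not existing
parameters:

# Existing internal helpers: every call is routed to the single
# backend JVM that runs Spark.
m <- SparkR:::newJObject("java.util.HashMap")
SparkR:::callJMethod(m, "put", "key", "value")
n <- SparkR:::callJMethod(m, "size")

# Hypothetical extension for multiple JVMs / Spark contexts: an optional
# jvmId selects which backend the call is routed to, with a default that
# keeps existing callers unchanged (jvmId and currentJVM are illustrative,
# not existing parameters).
callJMethod <- function(objId, methodName, ..., jvmId = currentJVM()) {
  invokeJava(isStatic = FALSE, objId, methodName, ..., jvmId = jvmId)
}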
The second question is whether we want to expose the more internal RBackend
as a public API -- this is something we will have to discuss separately, and I
am not sure we are ready to make such an RPC API public yet. The remote
backend change still assumes that the client and the server can change
hand-in-hand; making RBackend public would mean that the server-side API is
fixed.
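For context, the coupling comes from the wire format: the R side serializes
each call and RBackendHandler on the JVM side decodes it in lockstep. Roughly
(this is an approximation of the internal serialize.R helpers, not a frozen
specification):

# Approximate request framing used by invokeJava:
rc <- rawConnection(raw(0), "r+")
writeBoolean(rc, isStatic)   # static vs. instance call
writeString(rc, objId)       # target object id (or class name)
writeString(rc, methodName)  # method to invoke
writeInt(rc, length(args))   # number of arguments
writeArgs(rc, args)          # type-tagged argument payloads
# Making RBackend public would freeze this layout -- any change would
# break clients that are not upgraded hand-in-hand with the server.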
> Making JVM backend calling functions public
> -------------------------------------------
>
> Key: SPARK-16581
> URL: https://issues.apache.org/jira/browse/SPARK-16581
> Project: Spark
> Issue Type: Sub-task
> Components: SparkR
> Reporter: Shivaram Venkataraman
>
> As described in the design doc in SPARK-15799, to help packages that need to
> call into the JVM, it would be good to expose some of the R -> JVM functions
> we have.
> As a part of this we could also rename and reformat the functions to make
> them more user-friendly (one possible shape is sketched below).
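One possible shape for those public wrappers, with illustrative names: thin
exported aliases over the internal helpers, so packages depend on a documented
surface instead of SparkR::: internals (nothing here is a committed API):

# Illustrative public wrappers -- names and signatures are a sketch only.
sparkR.newJObject <- function(class, ...) {
  SparkR:::newJObject(class, ...)
}
sparkR.callJMethod <- function(jobj, method, ...) {
  SparkR:::callJMethod(jobj, method, ...)
}
sparkR.callJStatic <- function(class, method, ...) {
  SparkR:::callJStatic(class, method, ...)
}

# Example: a package calling a static Java method through the public surface.
s <- sparkR.callJStatic("java.lang.String", "valueOf", 42L)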