Forgot to reply all.

I see. But what prevents, e.g., the R driver from getting those command line
arguments from spark-submit and setting them with SparkConf on the R driver's
in-process JVM through JNI?

On Thu, Sep 10, 2015 at 9:29 PM, Shivaram Venkataraman <
shiva...@eecs.berkeley.edu> wrote:

> Yeah, in addition to the downside of having 2 JVMs, the command line
> arguments, SparkConf, etc. will be set by spark-submit in the first
> JVM and won't be available in the second JVM.
>
> Shivaram
>
> On Thu, Sep 10, 2015 at 5:18 PM, Renyi Xiong <renyixio...@gmail.com>
> wrote:
> > For the 2nd case, where the JVM comes up first, we can also launch in-process
> > JNI just like in inter-process mode, correct? (The difference is that a 2nd
> > JVM gets loaded.)
> >
> > On Thu, Aug 6, 2015 at 9:51 PM, Shivaram Venkataraman
> > <shiva...@eecs.berkeley.edu> wrote:
> >>
> >> The in-process JNI only works out when the R process comes up first
> >> and we launch a JVM inside it. In many deploy modes like YARN (or
> >> actually in anything using spark-submit) the JVM comes up first and we
> >> launch R after that. Using an inter-process solution helps us cover
> >> both use cases.
> >>
> >> Thanks
> >> Shivaram
> >>
> >> On Thu, Aug 6, 2015 at 8:33 PM, Renyi Xiong <renyixio...@gmail.com>
> wrote:
> >> > Why did SparkR eventually choose the inter-process socket solution on the
> >> > driver side instead of the in-process JNI shown in one of its docs below
> >> > (about page 20)?
> >> >
> >> >
> >> >
> https://spark-summit.org/wp-content/uploads/2014/07/SparkR-Interactive-R-Programs-at-Scale-Shivaram-Vankataraman-Zongheng-Yang.pdf
> >> >
> >> >
> >
> >
>
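The point about conf availability can be sketched as follows. This is a
hypothetical, minimal illustration (not SparkR's actual backend protocol): the
JVM that spark-submit launched holds the properties it was given and serves
them over a local socket, so a separate process (the R driver, in SparkR's
case) can query them instead of trying to reconstruct them in a second JVM.
All names here are illustrative.

```java
import java.io.*;
import java.net.*;
import java.util.*;

public class BackendSketch {
    public static void main(String[] args) throws Exception {
        // Properties that spark-submit would have set in the launching JVM.
        // ("spark.master" / "local[2]" are illustrative values.)
        Map<String, String> conf = new HashMap<>();
        conf.put("spark.master", "local[2]");

        // Backend listens on an ephemeral loopback port.
        ServerSocket server =
            new ServerSocket(0, 1, InetAddress.getLoopbackAddress());
        int port = server.getLocalPort();

        Thread backend = new Thread(() -> {
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                     new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                String key = in.readLine();              // e.g. "spark.master"
                out.println(conf.getOrDefault(key, "")); // reply with the value
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        backend.start();

        // Stand-in for the R side: in SparkR this is a separate R process;
        // here it is just a plain socket client in the same program.
        try (Socket s = new Socket(InetAddress.getLoopbackAddress(), port);
             PrintWriter out = new PrintWriter(s.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                 new InputStreamReader(s.getInputStream()))) {
            out.println("spark.master");
            System.out.println(in.readLine()); // prints: local[2]
        }
        backend.join();
        server.close();
    }
}
```

Because the conf lives only in the first JVM's memory, an in-process JVM
started from R via JNI would start with none of it; the socket hop is what
makes the first JVM's state reachable from the R process.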
