>>> I have implemented running interpreters in a yarn cluster and succeeded
>>> in launching SparkInterpreter in local mode within the yarn cluster.
Do you mean you can run the Spark interpreter in yarn-cluster mode with
SPARK_HOME set?

IIUC, running the Spark interpreter in yarn-client or yarn-cluster mode
requires SPARK_HOME and HADOOP_CONF_DIR.
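
For reference, a minimal sketch of what that might look like in
conf/zeppelin-env.sh (the paths below are placeholders for illustration,
not verified defaults):

```
# conf/zeppelin-env.sh -- minimal sketch for yarn-client mode
# NOTE: the paths here are assumptions; adjust to your installation.
export SPARK_HOME=/opt/spark              # needed for yarn-client/yarn-cluster
export HADOOP_CONF_DIR=/etc/hadoop/conf   # so Spark can find the YARN/HDFS configs
export MASTER=yarn-client                 # or set master in the interpreter setting
```

And per the Spark docs quoted further down, you would presumably also set
spark.yarn.jars (or spark.yarn.archive) so YARN can locate the runtime
jars, e.g. in conf/spark-defaults.conf:

```
# conf/spark-defaults.conf -- the HDFS path here is just an example
spark.yarn.jars hdfs:///spark/jars/*.jar
```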

Jongyoul Lee <jongy...@gmail.com> wrote on Tue, May 9, 2017 at 10:52 AM:

> I have implemented running interpreters in a yarn cluster and succeeded
> in launching SparkInterpreter in local mode within the yarn cluster. BTW,
> I've tested it with yarn-client, but it needs SPARK_HOME to be set.
>
> I'm not sure if it was possible before, but I want to do it with embedded
> spark.
>
> Thanks for testing it.
>
>
>
> On Wed, May 10, 2017 at 2:16 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>
> > yarn-client mode doesn't work for embedded spark. But did it work
> > before? I think embedded spark should only work with local mode.
> >
> >
> >
> >
> > Jongyoul Lee <jongy...@gmail.com> wrote on Tue, May 9, 2017 at 10:02 AM:
> >
> > > Hi devs,
> > >
> > > I need your help to verify some modes of SparkInterpreter. I saw the
> > > message below on the Spark website about yarn mode:
> > >
> > > ```
> > > To make Spark runtime jars accessible from YARN side, you can specify
> > > spark.yarn.archive or spark.yarn.jars. For details please refer to Spark
> > > Properties
> > > <http://spark.apache.org/docs/latest/running-on-yarn.html#spark-properties>.
> > > If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark
> > > will create a zip file with all jars under $SPARK_HOME/jars and upload
> > > it to the distributed cache.
> > > ```
> > >
> > > It means that if you use the internal spark, you cannot run yarn mode
> > > on the current master. Can anyone test it and let me know the result?
> > >
> > > Thanks in advance,
> > > Jongyoul
> > >
> > > --
> > > 이종열, Jongyoul Lee, 李宗烈
> > > http://madeng.net
> > >
> >
>
>
>
> --
> 이종열, Jongyoul Lee, 李宗烈
> http://madeng.net
>
