Re: Help verify yarn client mode without SPARK_HOME

2017-05-13 Thread Jongyoul Lee
That's because Spark has to read HADOOP_CONF_DIR and has to compress jars from SPARK_HOME. If we can handle that ourselves, we don't have to add it into Spark. On Wed, May 10, 2017 at 5:19 AM, Jeff Zhang wrote: > >>> I have implemented running interpreters in a yarn cluster and
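For readers following along, a minimal sketch (not Zeppelin's actual launch code) of the two requirements mentioned above: the YARN client needs HADOOP_CONF_DIR in its environment, and pointing spark.yarn.archive at a pre-staged archive on HDFS avoids compressing jars from a local SPARK_HOME. The application name and HDFS path below are assumptions.

```scala
import org.apache.spark.sql.SparkSession

// Fail fast if HADOOP_CONF_DIR is missing: the YARN client reads the
// cluster configuration (yarn-site.xml, core-site.xml) from there.
require(sys.env.contains("HADOOP_CONF_DIR"),
  "HADOOP_CONF_DIR must point at the cluster's Hadoop configuration")

// With spark.yarn.archive pre-staged on HDFS, the launcher does not need
// to zip up jars from a local SPARK_HOME before submitting to YARN.
val spark = SparkSession.builder()
  .appName("zeppelin-yarn-client-check")                        // hypothetical name
  .master("yarn")
  .config("spark.submit.deployMode", "client")
  .config("spark.yarn.archive", "hdfs:///spark/spark-libs.zip") // hypothetical path
  .getOrCreate()
```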

Re: Help verify yarn client mode without SPARK_HOME

2017-05-09 Thread Jeff Zhang
>>> I have implemented running interpreters in a yarn cluster and succeeded in >>> launching SparkInterpreter with local mode in the yarn cluster. Do you mean you can run the spark interpreter in yarn-cluster mode with SPARK_HOME set? IIUC, running the spark interpreter in yarn-client or yarn-cluster mode requires

Re: Help verify yarn client mode without SPARK_HOME

2017-05-09 Thread Jongyoul Lee
I have implemented running interpreters in a yarn cluster and succeeded in launching SparkInterpreter with local mode in the yarn cluster. BTW, I've tested it with yarn-client, but it requires SPARK_HOME to be set. I'm not sure if it was possible before, but I want to do it with embedded spark. Thanks for

Re: Help verify yarn client mode without SPARK_HOME

2017-05-09 Thread Jeff Zhang
yarn-client mode doesn't work for embedded spark. But did it work before? I think embedded spark should only work with local mode. Jongyoul Lee wrote on Tue, May 9, 2017 at 10:02 AM: > Hi devs, > > I need your help to verify a mode of SparkInterpreter. I saw the message > below from

Help verify yarn client mode without SPARK_HOME

2017-05-09 Thread Jongyoul Lee
Hi devs, I need your help to verify a mode of SparkInterpreter. I saw the message below on the Spark website about yarn mode: ``` To make Spark runtime jars accessible from YARN side, you can specify spark.yarn.archive or spark.yarn.jars. For details please refer to Spark Properties
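Since the request is specifically to verify the mode, here is a small sketch of what a verification paragraph might look like inside a running SparkInterpreter session. It assumes Zeppelin exposes the usual `spark` session object; the expected values in the comments are what yarn-client mode should report.

```scala
import org.apache.spark.sql.SparkSession

// In a SparkInterpreter paragraph Zeppelin already provides a session named
// `spark`; getOrCreate() below simply returns the active one if present.
val spark = SparkSession.builder().getOrCreate()

println(spark.sparkContext.master)      // expect "yarn" in yarn-client mode
println(spark.sparkContext.deployMode)  // expect "client"

// Were the runtime jars supplied via spark.yarn.archive or spark.yarn.jars?
println(spark.sparkContext.getConf.getOption("spark.yarn.archive"))
println(spark.sparkContext.getConf.getOption("spark.yarn.jars"))
```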