Hello,

Is this new? In earlier Zeppelin versions, I could just set the options 
mentioned below and it worked in yarn-client mode.
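For reference, here is a minimal zeppelin-env.sh sketch combining the SPARK_HOME suggestion with my YARN settings (the SPARK_HOME path is an assumption; adjust it to your installation):

```shell
#!/bin/bash
# Minimal zeppelin-env.sh sketch for Spark on YARN (yarn-client mode).
export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export SPARK_HOME=/opt/spark                   # assumed install path; point at your Spark
export MASTER=yarn-client                      # run the interpreter against YARN
export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop  # must contain yarn-site.xml, core-site.xml
export ZEPPELIN_PORT=10080
```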

> On 29.12.2015 at 20:40, Hyung Sung Shim <hss...@nflabs.com> wrote:
> 
> Hello Jens Rabe.
> 
> If you want to run zeppelin using spark-submit, you should set the variable 
> SPARK_HOME in zeppelin-env.sh.
> 
> Thanks.
> 
> 
> 
> 2015-12-30 4:18 GMT+09:00 Jens Rabe <rabe-j...@t-online.de>:
> Hello,
> 
> I am trying to set up Zeppelin to use Spark on YARN. Spark on YARN itself 
> works: I can use spark-submit and spark-shell. So I set up Zeppelin, and my 
> zeppelin-env.sh contains the following:
> 
> #!/bin/bash
> 
> export JAVA_HOME=/usr/lib/jvm/java-7-oracle
> export MASTER=yarn-client                     # Spark master URL, e.g.
> # spark://master_addr:7077. Leave empty to use local mode.
> export ZEPPELIN_JAVA_OPTS="-Dspark.dynamicAllocation.enabled=true -Dspark.shuffle.service.enabled=true"
> # Additional JVM options, for example:
> # export ZEPPELIN_JAVA_OPTS="-Dspark.executor.memory=8g -Dspark.cores.max=16"
> export ZEPPELIN_PORT=10080
> export HADOOP_CONF_DIR=/opt/hadoop/etc/hadoop
> 
> I double-checked that /opt/hadoop/etc/hadoop really contains the correct 
> configuration files, and it does. zeppelin-env.sh is executable, too. But 
> when I start Zeppelin and try to submit something, it tries to connect to a 
> YARN ResourceManager at 127.0.0.1, so it seems to ignore HADOOP_CONF_DIR.
> 
> Is this a bug or am I missing something?
> 
> - Jens
> 
