AW: AW: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-29 Thread Rabe, Jens
Once the library is ready for release, I am going to put it on our company-internal Nexus server, but as of now, it is still work in progress. From: Felipe Almeida [mailto:falmeida1...@gmail.com] Sent: Sunday, 28 February 2016 00:23 To: users@zeppelin.incubator.apache.org Subject: Re: AW:

Re: AW: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-27 Thread Felipe Almeida
You can also add Maven packages and Spark will download them (along with any dependencies); just use the --packages option. There's a little example at the end of this post, but I'm still working on it:
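For example, assuming the library were published to a Maven repository under the hypothetical coordinates com.example:mylib:0.1.0 (not the poster's actual artifact), the Zeppelin-side configuration might look like this sketch:

```shell
# In conf/zeppelin-env.sh: ask spark-submit to resolve the package and
# its transitive dependencies from Maven Central at launch time.
export SPARK_SUBMIT_OPTIONS="--packages com.example:mylib:0.1.0"
```

--packages accepts a comma-separated list of coordinates, and --repositories can point spark-submit at a private repository such as the Nexus server mentioned above.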

AW: -Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-26 Thread Rabe, Jens
Hello, I found out how to add the library. Since I run Spark with spark-submit, I have to add the option to the SPARK_SUBMIT_OPTIONS variable, so I added: export SPARK_SUBMIT_OPTIONS="--jars /home/zeppelin/jars/mylib.jar" Now it works. This should be added to the documentation, though. From:
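Spelled out, the working configuration from this message (the jar path is the poster's; the comma-separated form for several jars follows spark-submit's documented --jars syntax):

```shell
# conf/zeppelin-env.sh — extra options that Zeppelin passes through
# to spark-submit when it launches the Spark interpreter.
export SPARK_SUBMIT_OPTIONS="--jars /home/zeppelin/jars/mylib.jar"

# Several jars would be given as a comma-separated list, e.g.:
# export SPARK_SUBMIT_OPTIONS="--jars /path/a.jar,/path/b.jar"
```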

-Dspark.jars is ignored when running in yarn-client mode, also when adding the jar with sc.addJars

2016-02-26 Thread Rabe, Jens
Hello, I have a library I want to embed in Zeppelin. I am using yesterday's build from Git, and Spark 1.6. Here is my conf/zeppelin-env.sh:

export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export MASTER=yarn-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
export ZEPPELIN_PORT=10080
export
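For reference, a sketch of the equivalent standalone spark-submit invocations (paths and class name are illustrative, not from the thread): the spark.jars property can also be set with --conf, though in this thread the SPARK_SUBMIT_OPTIONS route with --jars is what ultimately worked.

```shell
# Shipping a jar to a yarn-client application via spark-submit:
spark-submit --master yarn-client \
  --jars /home/zeppelin/jars/mylib.jar \
  --class com.example.Main app.jar

# Or by setting the configuration property directly:
# spark-submit --master yarn-client \
#   --conf spark.jars=/home/zeppelin/jars/mylib.jar \
#   --class com.example.Main app.jar
```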