Once the library is ready for release, I am going to put it on our 
company-internal Nexus server, but for now it is still a work in progress.

From: Felipe Almeida [mailto:falmeida1...@gmail.com]
Sent: Sunday, 28 February 2016 00:23
To: users@zeppelin.incubator.apache.org
Subject: Re: RE: -Dspark.jars is ignored when running in yarn-client mode, also 
when adding the jar with sc.addJars


You can also add Maven packages and Spark will download them (along with any 
dependencies); just use the --packages directive. There's a short example at 
the end of this post, though I'm still working on it: 
http://queirozf.com/entries/apache-zeppelin-spark-streaming-and-amazon-kinesis-simple-guide-and-examples
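For instance (a hedged sketch: the Maven coordinates below are a placeholder, not a published artifact), --packages can go into the same SPARK_SUBMIT_OPTIONS variable used elsewhere in this thread:

```shell
# Sketch only: com.example:mylib_2.10:1.0.0 is a made-up coordinate.
# --packages makes spark-submit resolve the artifact and its transitive
# dependencies from Maven Central at startup.
export SPARK_SUBMIT_OPTIONS="--packages com.example:mylib_2.10:1.0.0"
echo "$SPARK_SUBMIT_OPTIONS"
```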

FA
On 26.02.2016 05:57, "Rabe, Jens" 
<jens.r...@iwes.fraunhofer.de> wrote:
Hello,

I found out how to add the library. Since I run Spark with spark-submit, I 
have to add the option to the SPARK_SUBMIT_OPTIONS variable, so I added:
export SPARK_SUBMIT_OPTIONS="--jars /home/zeppelin/jars/mylib.jar"

Now it works.

This should be added to the documentation though.
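One detail worth documenting as well: --jars takes a comma-separated list, so further libraries can be appended the same way (a sketch; the second path below is a placeholder, not a real file from this setup):

```shell
# Sketch: --jars is a comma-separated list of local jars;
# /home/zeppelin/jars/other.jar is made up for illustration.
export SPARK_SUBMIT_OPTIONS="--jars /home/zeppelin/jars/mylib.jar,/home/zeppelin/jars/other.jar"
echo "$SPARK_SUBMIT_OPTIONS"
```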

From: Rabe, Jens 
[mailto:jens.r...@iwes.fraunhofer.de]
Sent: Friday, 26 February 2016 09:26
To: 
users@zeppelin.incubator.apache.org
Subject: -Dspark.jars is ignored when running in yarn-client mode, also when 
adding the jar with sc.addJars

Hello,

I have a library I want to embed in Zeppelin.

I am using a build from Git yesterday, and Spark 1.6.

Here is my conf/zeppelin-env.sh:

export JAVA_HOME=/usr/lib/jvm/java-7-oracle
export MASTER=yarn-client
export HADOOP_CONF_DIR=/etc/hadoop/conf
export ZEPPELIN_PORT=10080
export SPARK_HOME=/opt/spark
export ZEPPELIN_JAVA_OPTS="-Dhdp.version=current 
-Dspark.jars=/home/zeppelin/jars/mylib.jar"

Here is my /opt/spark/conf/spark-defaults.conf:

spark.master yarn-client
spark.dynamicAllocation.enabled true
spark.shuffle.service.enabled true
spark.driver.extraJavaOptions -Dhdp.version=current
spark.yarn.am.extraJavaOptions -Dhdp.version=current
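An alternative worth trying (untested here) is to declare the jar in spark-defaults.conf itself via the standard spark.jars property, which Spark reads regardless of how the driver JVM was launched:

```
# Sketch: spark.jars is a standard, comma-separated Spark property;
# the path matches the one used in zeppelin-env.sh above.
spark.jars /home/zeppelin/jars/mylib.jar
```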

Now, I try to run Zeppelin normally.

When I then try to import something from my lib:

import com.example._

I get:

<console>:27: error: not found: value com

I also tried "--conf jars=…" and "--jars", to no avail: Zeppelin then 
won't start because of an "unrecognized option".

When I do a "ps ax | grep java", the command-line option seems to be passed 
correctly:
  481 ?        Sl     0:07 /usr/lib/jvm/java-7-oracle/bin/java 
-Dhdp.version=current -Dspark.jars=/home/zeppelin/jars/mylib.jar 
-Dfile.encoding=UTF-8 -Xms1024m -Xmx1024m -XX:MaxPermSize=512m 
-Dzeppelin.log.file=/home/zeppelin/incubator-zeppelin/logs/zeppelin--hadoop-frontend.log
 -cp 
::/home/zeppelin/incubator-zeppelin/zeppelin-server/target/lib/*:/home/zeppelin/incubator-zeppelin/zeppelin-zengine/target/lib/*:/home/zeppelin/incubator-zeppelin/zeppelin-interpreter/target/lib/*:/home/zeppelin/incubator-zeppelin/lib/*:/home/zeppelin/incubator-zeppelin/*::/home/zeppelin/incubator-zeppelin/conf:/home/zeppelin/incubator-zeppelin/zeppelin-interpreter/target/classes:/home/zeppelin/incubator-zeppelin/zeppelin-zengine/target/classes:/home/zeppelin/incubator-zeppelin/zeppelin-server/target/classes
 org.apache.zeppelin.server.ZeppelinServer
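Note that the process above is the ZeppelinServer web-server JVM; in yarn-client mode the Spark driver runs inside the separate interpreter process, so a -D option set on the server JVM may never reach Spark at all. A quick sanity check could look like the sketch below (the RemoteInterpreterServer process name is an assumption about how Zeppelin spawns its interpreters):

```shell
# Sketch: look for the jar on the interpreter JVM's command line rather than
# the ZeppelinServer JVM's. On a box with no running interpreter this prints
# "absent", which is also what it shows if the jar never reached the driver.
if ps ax | grep '[R]emoteInterpreterServer' | grep -q 'mylib.jar'; then
  status="present"
else
  status="absent"
fi
echo "jar on interpreter command line: $status"
```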

Even when I upload mylib.jar to HDFS and use "sc.addJar", I cannot use the library.

What am I missing?
