Hi David,

With the master branch (0.6.0-SNAPSHOT):
1) Download and configure Spark, and make sure everything works with the
bin/spark-shell command. If you have extra jars, you can add the spark.files
property in SPARK_HOME/conf/spark-defaults.conf.
2) Export SPARK_HOME in the conf/zeppelin-env.sh file.
3) Start Zeppelin and configure master, spark.executor.memory, etc. in the
Interpreter menu.

I think these steps are the easiest way to set up Zeppelin with extra jars.
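For reference, the configuration in steps 1 and 2 might look like the sketch below; the jar and Spark paths are placeholders, not actual paths from this thread:

```shell
# In SPARK_HOME/conf/spark-defaults.conf (a properties file, not shell),
# list any extra jars to ship to executors, e.g.:
#   spark.files  /path/to/extra.jar

# In ZEPPELIN_HOME/conf/zeppelin-env.sh, point Zeppelin at the Spark
# installation verified in step 1:
export SPARK_HOME=/path/to/spark
```

Master and per-interpreter settings such as spark.executor.memory are then set in the Interpreter menu rather than in these files.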

Thanks,
moon

On Thu, Sep 17, 2015 at 5:41 PM David Salinas <david.salinas....@gmail.com>
wrote:

> Hi,
>
> I have just tested without setting the classpath in the conf and 1/
> worked. Could you tell me what is the best way to set up your classpath/jar
> now?
>
> Best,
>
> David
>
> On Thu, Sep 17, 2015 at 10:34 AM, David Salinas <
> david.salinas....@gmail.com> wrote:
>
>> Hi,
>>
>> I have tried this example after
>> https://github.com/apache/incubator-zeppelin/pull/270.
>>
>> But it is not working for several reasons:
>>
>> 1/ Zeppelin context is not found (!):
>> val s = z.input("Foo")
>> <console>:21: error: not found: value z
>>
>> 2/ If I include my jar, the classpath is not communicated to the slaves, so
>> the code works only locally (it used to work on the cluster before this
>> change). I guess there is something wrong with the way I set the classpath
>> (which is probably also linked to 1/).
>>
>> I have added this line in zeppelin-env.sh to use one of my jars:
>> export ZEPPELIN_JAVA_OPTS="-Dspark.driver.host=`hostname`
>> -Dspark.mesos.coarse=true -Dspark.executor.memory=20g -Dspark.cores.max=80
>> -Dspark.jars=${SOME_JAR} -cp ${SOME_CLASSPATH_FOR_THE_JAR}"
>>
>> How can one add extra classpath jars with this new version? Could you add
>> a ZEPPELIN_EXTRA_JAR or ZEPPELIN_EXTRA_CLASSPATH variable to
>> zeppelin-env.sh so that users can easily add their own code?
>>
>> Best,
>>
>> David
>>
>
>