Sourav,

There are a couple of ways to add an external jar when Zeppelin (0.6.0-SNAPSHOT)
uses the spark-submit command.

1. Using the %dep interpreter, as Vinay mentioned.
    eg)
       %dep
        z.load("group:artifact:version")

        %spark
        import ....
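
     note) z.load can also take a local jar path, and an extra maven
     repository can be registered first. A minimal sketch (the repo name,
     repo URL, and jar path below are placeholders, not real locations):

        %dep
        z.reset()
        z.addRepo("My Repo").url("http://repo.example.com/maven")
        z.load("/path/to/my.jar")

     Keep in mind that %dep paragraphs must run before the spark
     interpreter starts, so restart the interpreter if %spark has
     already run in the notebook.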

2. By adding the spark.files property to SPARK_HOME/conf/spark-defaults.conf
     eg)
        spark.files  /path/to/my.jar
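
     note) multiple files can be listed comma-separated; and if the jars
     also need to be on the driver and executor classpaths, the spark.jars
     property works the same way (the paths below are placeholders):

        spark.jars  /path/to/my.jar,/path/to/other.jar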

3. By exporting the SPARK_SUBMIT_OPTIONS env variable in
ZEPPELIN_HOME/conf/zeppelin-env.sh
     eg)
        export SPARK_SUBMIT_OPTIONS="--packages group:artifact:version"
     note) does not work for pyspark yet.
https://issues.apache.org/jira/browse/ZEPPELIN-339
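
     note) for plain local jars (rather than maven coordinates), the same
     variable can carry spark-submit's --jars flag; the path below is a
     placeholder:

        export SPARK_SUBMIT_OPTIONS="--jars /path/to/my.jar"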

Hope this helps.

Best,
moon

On Wed, Oct 21, 2015 at 12:55 AM Vinay Shukla <vinayshu...@gmail.com> wrote:

> Sourav,
>
>
> Agree this would be a useful feature. Right now Zeppelin can import
> dependencies from Maven with the %dep interpreter, but it does not
> understand spark-packages.
>
>
> -Vinay
>
>
> On Tue, Oct 20, 2015 at 6:54 AM, Sourav Mazumder <
> sourav.mazumde...@gmail.com> wrote:
>
>> Looks like right now there is no way one can pass additional jar files to
>> spark-submit from Zeppelin. The same works fine if I am not using the
>> spark-submit option (by not specifying SPARK_HOME).
>>
>> When I checked the code in interpreter.sh, I found that for the classpath
>> it only passes the zeppelin-spark*.jar available in the
>> ZEPPELIN_HOME/interpreter/spark directory.
>>
>> I suggest filing this as a bug/enhancement. The solution should be pretty
>> easy, requiring only small changes in interpreter.sh (I've done the same
>> and made it work with an external_lib folder under the
>> ZEPPELIN_HOME/interpreter/spark directory).
>>
>> Regards,
>> Sourav
>>
>
>
