Were you guys able to use %spark.dep for %pyspark?

According to documentation this should work:
https://zeppelin.apache.org/docs/0.7.2/interpreter/spark.html#dependency-management
" Note: %spark.dep interpreter loads libraries to %spark and %spark.pyspark but
not to %spark.sql interpreter.  "
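
For reference, what I'm trying is roughly the usage from that page (the artifact coordinates here are just an example, not the actual library I need):

```
%spark.dep
z.reset()
z.load("com.databricks:spark-csv_2.11:1.5.0")
```

After running that paragraph, the classes should (per the docs) be visible from a subsequent %pyspark paragraph.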

In practice, for some reason it doesn't work (tested on a recent master build).

(As a workaround I add a local jar to the --jars option in the
spark_submit_options, but using %spark.dep would be so much nicer.)
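
Concretely, the workaround looks like this in conf/zeppelin-env.sh (the jar path is just a placeholder, assuming SPARK_SUBMIT_OPTIONS is the variable meant above):

```
# conf/zeppelin-env.sh -- ship a local jar to the Spark interpreter
export SPARK_SUBMIT_OPTIONS="--jars /path/to/my-library.jar"
```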


Thanks,
Ruslan
