I've tried 0.8.0-rc4 on my EMR cluster using the preinstalled version
of Spark under /usr/lib/spark.

This works fine in local and yarn-client mode, but in yarn-cluster mode I
just get a

java.lang.NullPointerException at
org.apache.zeppelin.spark.NewSparkInterpreter.setupConfForPySpark(NewSparkInterpreter.java:149)

Seems to be caused by an unsuccessful search for the py4j libraries.
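
For what it's worth, here is a minimal sketch of the failure mode I
suspect (my own reconstruction, not the actual Zeppelin source):
java.io.File.listFiles() returns null for a directory that doesn't
exist, so iterating over the result of a lookup under
$SPARK_HOME/python/lib throws exactly this kind of NPE when SPARK_HOME
is missing from the remote interpreter's environment:

    import java.io.File;

    public class Py4jLookupSketch {
        public static void main(String[] args) {
            // May be null in the remote interpreter's environment
            String sparkHome = System.getenv("SPARK_HOME");
            // Concatenation yields "null/python/lib" if SPARK_HOME is unset
            File py4jDir = new File(sparkHome + "/python/lib");
            // listFiles() returns null for a non-existent directory,
            // so this enhanced for loop throws a NullPointerException
            for (File f : py4jDir.listFiles()) {
                if (f.getName().startsWith("py4j")) {
                    System.out.println("found " + f.getAbsolutePath());
                }
            }
        }
    }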
I've made sure that SPARK_HOME is actually set in .bashrc, in
zeppelin-env.sh, and via the new %spark.conf, but somehow it doesn't
seem to reach the remote interpreter.
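
Concretely, this is roughly what I set (assuming I have the
%spark.conf syntax right; I'm using the one-property-per-line form of
the new generic configuration paragraph):

    # in zeppelin-env.sh
    export SPARK_HOME=/usr/lib/spark

    # as the first paragraph of the note
    %spark.conf
    SPARK_HOME /usr/lib/spark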

Best regards,
 Thomas
