Hi,

I am running the Livy server against Spark without Hadoop. I am
setting only SPARK_HOME, and after submitting a job I see the
following in the Livy UI logs.

I am using a pretty much standard configuration, except for:
livy.spark.deploy-mode = cluster
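
In case it helps, the relevant part of my conf/livy.conf looks roughly like this (a minimal sketch for a standalone setup; the master URL is a placeholder, and I am not sure livy.spark.master is even required here):

```
# conf/livy.conf -- minimal sketch for plain Spark standalone (no YARN);
# spark://my-master:7077 is a placeholder for the actual master URL
livy.spark.master = spark://my-master:7077
livy.spark.deploy-mode = cluster
```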

Do I need a Hadoop installation as well, with HADOOP_CONF_DIR specified?

Isn't it possible to run Livy with "plain" Spark, without YARN?

stderr:
java.lang.ClassNotFoundException:
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Thanks!

-- 
Stefan Miklosovic
