Hi folks, running into a pretty strange issue:

I'm setting
spark.executor.extraClassPath
spark.driver.extraClassPath

to point to some external JARs. If I set them in spark-defaults.conf
everything works perfectly.
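
For reference, the working spark-defaults.conf entries are along these lines (the jar path below is just a placeholder, not my actual path):

    spark.executor.extraClassPath   /opt/extra-jars/my-fs.jar
    spark.driver.extraClassPath     /opt/extra-jars/my-fs.jar
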
However, if I remove spark-defaults.conf and instead create a SparkConf and call

    .set("spark.executor.extraClassPath", "...")
    .set("spark.driver.extraClassPath", "...")

I get a ClassNotFoundException from Hadoop's Configuration class:

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.ceph.CephFileSystem not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1493)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1585)

This seems like a bug to me -- or does spark-defaults.conf somehow get
processed differently?

I have dumped out sparkConf.toDebugString, and in both cases
(spark-defaults.conf vs. setting the values in code) it appears to contain the same values...
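
The dump is just this (using the conf object from the sketch above):

    // print every key/value pair the SparkConf actually holds
    println(conf.toDebugString)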
