Re: Difference between spark-defaults.conf and SparkConf.set

2015-07-01 Thread Yana
Thanks. Without spark-submit it seems the more straightforward solution is to 
just pass the JARs on the driver's classpath. I was more surprised that the same 
conf parameter behaves differently depending on where it's specified: in the 
program vs. in spark-defaults.conf. I'm all set now, thanks for replying.
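
Without spark-submit, that amounts to launching the driver JVM with the 
external JARs on its classpath. A hypothetical launch line (the paths and 
main class are placeholders, not from the thread):

    java -cp "/path/to/app.jar:/path/to/external-dep.jar:/path/to/spark/lib/*" \
      com.example.MyApp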

-------- Original message --------
From: Akhil Das
Date: 07/01/2015 2:27 AM (GMT-05:00)
To: Yana Kadiyska
Cc: user@spark.apache.org
Subject: Re: Difference between spark-defaults.conf and SparkConf.set
.addJar works for me when I run it as a stand-alone application (without
using spark-submit).
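
A minimal sketch of that approach, assuming a standalone cluster and a
hypothetical JAR path (both are assumptions, not from the thread):

    import org.apache.spark.{SparkConf, SparkContext}

    object AddJarExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("addjar-example")
          .setMaster("spark://master:7077") // hypothetical master URL
        val sc = new SparkContext(conf)
        // Ships the JAR to executors so tasks can load its classes
        sc.addJar("/opt/libs/external-dep.jar") // hypothetical path
        // ... job code ...
        sc.stop()
      }
    }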

Thanks
Best Regards

On Tue, Jun 30, 2015 at 7:47 PM, Yana Kadiyska wrote:
Hi folks, running into a pretty strange issue:

I'm setting
spark.executor.extraClassPath 
spark.driver.extraClassPath

to point to some external JARs. If I set them in spark-defaults.conf everything 
works perfectly.
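
For reference, the working spark-defaults.conf entries look something like
this (the JAR path is a placeholder):

    spark.executor.extraClassPath /opt/libs/external-dep.jar
    spark.driver.extraClassPath   /opt/libs/external-dep.jar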
However, if I remove spark-defaults.conf and just create a SparkConf and call
.set("spark.executor.extraClassPath", "...")
.set("spark.driver.extraClassPath", "...")

I get ClassNotFound exceptions from Hadoop Conf:

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.ceph.CephFileSystem not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1493)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1585)

This seems like a bug to me -- or does spark-defaults.conf somehow get 
processed differently?

I have dumped out sparkConf.toDebugString and in both cases
(spark-defaults.conf vs. setting in code) it seems to have the same values in it...
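
That diagnostic can be reproduced with a few lines; a minimal sketch, with
hypothetical paths:

    import org.apache.spark.{SparkConf, SparkContext}

    object ConfDebug {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("conf-debug")
          .set("spark.executor.extraClassPath", "/opt/libs/external-dep.jar") // hypothetical
          .set("spark.driver.extraClassPath", "/opt/libs/external-dep.jar")   // hypothetical
        val sc = new SparkContext(conf)
        // Prints every key/value pair the driver actually sees, one per line
        println(sc.getConf.toDebugString)
        sc.stop()
      }
    }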


