Re: settings from props file seem to be ignored in mesos

2015-06-16 Thread Gary Ogden
There isn't a conf/spark-defaults.conf file in the .tgz. There's a template
file, but we didn't think we'd need one. I assumed we would use the defaults,
and anything we wanted to override would go in the properties file we load via
--properties-file, along with command-line parameters (--master, etc.).
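Roughly, the submit looks like the sketch below (the Mesos master URL, properties
file path, class name, and jar path are placeholders, not our actual values):

  spark-submit \
    --master mesos://mesos-master:5050 \
    --properties-file /path/to/job.properties \
    --class com.example.MyJob \
    /path/to/my-job.jar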



On 16 June 2015 at 04:34, Akhil Das ak...@sigmoidanalytics.com wrote:

 What's in your executor's (that .tgz file) conf/spark-defaults.conf file?

 Thanks
 Best Regards

 On Mon, Jun 15, 2015 at 7:14 PM, Gary Ogden gog...@gmail.com wrote:

 I'm loading these settings from a properties file:
 spark.executor.memory=256M
 spark.cores.max=1
 spark.shuffle.consolidateFiles=true
 spark.task.cpus=1
 spark.deploy.defaultCores=1
 spark.driver.cores=1
 spark.scheduler.mode=FAIR

 Once the job is submitted to Mesos, I can go to the Spark UI for that job
 (hostname:4040) and, on the Environment tab, see that those settings are
 there.

 If I then comment out all those settings and allow Spark to use the
 defaults, it still appears to use the same settings in Mesos.

 Under both runs, it still shows 1 task, 3 CPUs, and 1 GB of memory.

 Nothing seems to change no matter what is put in that props file, even
 though the settings show up in the Spark Environment tab.





Re: settings from props file seem to be ignored in mesos

2015-06-16 Thread Akhil Das
What's in your executor's (that .tgz file) conf/spark-defaults.conf file?
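A quick way to check whether that file is even present is to list the tarball
contents, something like this (the tarball name here is just a placeholder):

  tar -tzf spark-1.3.1-bin-hadoop2.4.tgz | grep 'conf/spark-defaults'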

Thanks
Best Regards

On Mon, Jun 15, 2015 at 7:14 PM, Gary Ogden gog...@gmail.com wrote:

 I'm loading these settings from a properties file:
 spark.executor.memory=256M
 spark.cores.max=1
 spark.shuffle.consolidateFiles=true
 spark.task.cpus=1
 spark.deploy.defaultCores=1
 spark.driver.cores=1
 spark.scheduler.mode=FAIR

 Once the job is submitted to Mesos, I can go to the Spark UI for that job
 (hostname:4040) and, on the Environment tab, see that those settings are
 there.

 If I then comment out all those settings and allow Spark to use the
 defaults, it still appears to use the same settings in Mesos.

 Under both runs, it still shows 1 task, 3 CPUs, and 1 GB of memory.

 Nothing seems to change no matter what is put in that props file, even
 though the settings show up in the Spark Environment tab.



settings from props file seem to be ignored in mesos

2015-06-15 Thread Gary Ogden
I'm loading these settings from a properties file:
spark.executor.memory=256M
spark.cores.max=1
spark.shuffle.consolidateFiles=true
spark.task.cpus=1
spark.deploy.defaultCores=1
spark.driver.cores=1
spark.scheduler.mode=FAIR

Once the job is submitted to Mesos, I can go to the Spark UI for that job
(hostname:4040) and, on the Environment tab, see that those settings are
there.

If I then comment out all those settings and allow Spark to use the
defaults, it still appears to use the same settings in Mesos.

Under both runs, it still shows 1 task, 3 CPUs, and 1 GB of memory.

Nothing seems to change no matter what is put in that props file, even
though the settings show up in the Spark Environment tab.
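
One way to double-check which properties spark-submit actually parses (a sketch;
the master URL, properties file, class, and jar path are placeholders) is to run
it with --verbose, which prints the Spark properties it picked up, including
those taken from the properties file:

  spark-submit --verbose \
    --master mesos://mesos-master:5050 \
    --properties-file /path/to/job.properties \
    --class com.example.MyJob \
    /path/to/my-job.jar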