There isn't a conf/spark-defaults.conf file in the .tgz. There's a template
file, but we didn't think we'd need one. I assumed we'd use the defaults, and
anything we wanted to override would go in the properties file we load via
--properties-file, or be passed as command-line params (--master etc.).
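That override path can be sketched as a spark-submit invocation; this is a hypothetical example, not our actual command line (myapp.conf, the jar name, the class name, and the Mesos master URL are all placeholders):

```shell
# Settings in myapp.conf override Spark's built-in defaults;
# explicit command-line flags (--master, --conf) in turn override
# entries in the properties file.
spark-submit \
  --master mesos://zk://host1:2181/mesos \
  --properties-file myapp.conf \
  --conf spark.executor.memory=512M \
  --class com.example.MyJob \
  myapp.jar
```

With this layout, a missing conf/spark-defaults.conf in the executor .tgz should be harmless, since every non-default setting arrives via the properties file or the flags.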
What's in your executor's (that .tgz file) conf/spark-defaults.conf file?
Thanks
Best Regards
On Mon, Jun 15, 2015 at 7:14 PM, Gary Ogden wrote:
> I'm loading these settings from a properties file:
> spark.executor.memory=256M
> spark.cores.max=1
> spark.shuffle.consolidateFiles=true
> spark.task.c
I'm loading these settings from a properties file:
spark.executor.memory=256M
spark.cores.max=1
spark.shuffle.consolidateFiles=true
spark.task.cpus=1
spark.deploy.defaultCores=1
spark.driver.cores=1
spark.scheduler.mode=FAIR
Once the job is submitted to mesos, I can go to the spark UI for that job