You can set those in the spark-defaults.conf file under the conf
directory of your Spark installation.
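If it helps, here is a sketch of what the corresponding entries might look
like (the property names are from the Spark configuration docs; the values
are only illustrative, so adjust them to your cluster):

    spark.cores.max        8
    spark.executor.memory  4g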

Thanks
Best Regards

On Wed, Nov 5, 2014 at 4:51 PM, Ashic Mahtab <as...@live.com> wrote:

> Hi,
> The docs specify that we can control the amount of RAM / cores available
> via:
>
> -c CORES, --cores CORES
>     Total CPU cores to allow Spark applications to use on the machine
>     (default: all available); only on worker
> -m MEM, --memory MEM
>     Total amount of memory to allow Spark applications to use on the
>     machine, in a format like 1000M or 2G (default: your machine's total
>     RAM minus 1 GB); only on worker
> Omitting these values causes them to take on their defaults. Is there a
> way of explicitly "specifying" the default? Or is omitting the parameters
> the only way to get the default values? Will -c default and -m default
> work?
>
> Thanks,
> Ashic.
>
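
For the worker flags themselves, omitting -c / -m simply falls back to the
documented defaults. If you want to pin them explicitly, a sketch of starting
a worker by hand (the master URL and the numbers are illustrative, and the
exact launch command can vary by Spark version):

    ./bin/spark-class org.apache.spark.deploy.worker.Worker \
      spark://master-host:7077 --cores 4 --memory 8G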
