Kiruba,

You can specify the executor memory while starting up the spark shell. Something like this:

    MASTER=spark://<spark_hostname>:7077 SPARK_MEM=1g ./spark-shell

This will set the executor memory to 1GB. More info: https://spark.incubator.apache.org/docs/latest/configuration.html
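If you are running a standalone job rather than spark-shell, the same setting can also be applied as the spark.executor.memory system property before the SparkContext is created. A minimal Scala sketch against the 0.8-era API; the master URL, application name, and the 1g value are just placeholders:

    import org.apache.spark.SparkContext

    object MemoryDemo {
      def main(args: Array[String]) {
        // Must be set before the SparkContext is created; "1g" is an example value.
        System.setProperty("spark.executor.memory", "1g")

        // Master URL and application name are placeholders for illustration.
        val sc = new SparkContext("spark://<spark_hostname>:7077", "MemoryDemo")
        println(sc.parallelize(1 to 100).count())
        sc.stop()
      }
    }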
On Fri, Dec 27, 2013 at 12:59 AM, KK R <[email protected]> wrote:
> Hi,
> Each of the workers is allocated 1GB of memory. When I invoke the spark
> shell, only 512MB of the memory is getting used. How can I increase the
> memory of the executor for spark shell?
>
> Thanks,
> Kiruba

--
Regards
Rakesh Nair
