If you want to use 2g of memory on each worker, you can simply export
SPARK_WORKER_MEMORY=2g inside your spark-env.sh on every machine in the
cluster.
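A minimal sketch of what that spark-env.sh entry looks like (the conf/ path assumes the default Spark standalone layout; adjust to your install):

```shell
# conf/spark-env.sh -- place this on every worker machine in the cluster
# Total memory this worker offers to executors (here: 2 GB)
export SPARK_WORKER_MEMORY=2g
```

Note that you will typically need to restart the standalone workers (e.g. via sbin/stop-all.sh and sbin/start-all.sh on the master) for the new value to take effect.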
Thanks
Best Regards
On Wed, Apr 8, 2015 at 7:27 AM, Jia Yu jia...@asu.edu wrote:
Hi guys,
Currently I am running a Spark program on Amazon EC2. Each worker has around
2 GB of memory (slightly less).
By default, I can see each worker is allocated 976 MB of memory, as the table
below on the Spark web UI shows. I know this value comes from (total memory minus
1 GB). But I want more