Dear all:
Spark uses memory to cache RDDs, and the size of that cache region is specified by
spark.storage.memoryFraction.
Once an Executor starts, does Spark support adjusting/resizing the memory allocated
to this region dynamically?
Thanks.
--
Regards,
Zhaojie
AFAIK, No.
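For what it's worth, a minimal Scala sketch of how that fraction is set (assuming the Spark 1.x API; the app name below is purely illustrative). The value is read from the SparkConf when executors start, so it has to be chosen before the SparkContext is created:

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.storage.memoryFraction is read once at executor startup,
    // so it must be set before the SparkContext is created.
    val conf = new SparkConf()
      .setAppName("cache-fraction-example") // hypothetical app name
      .set("spark.storage.memoryFraction", "0.4") // default is 0.6 in Spark 1.x

    val sc = new SparkContext(conf)

    // RDDs cached from here on share the storage region sized by the
    // fraction above; there is no API to resize it on a running executor.
    val data = sc.parallelize(1 to 1000000).cache()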
Best Regards,
Raymond Liu
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 11:30 AM
To: user@spark.apache.org
Subject: resize memory size for caching RDD