Re: releasing memory without stopping the spark context ?

2016-08-31 Thread Mich Talebzadeh
Spark memory is the sum of execution memory and storage memory. unpersist() only releases the storage memory; the execution memory is still in use, which is what Spark is all about. HTH

Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
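As a minimal sketch of what unpersist() does and does not release (the app name and input path below are illustrative, not from this thread):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("unpersist-sketch")   // hypothetical app name
      .getOrCreate()

    // Hypothetical input path, for illustration only.
    val df = spark.read.parquet("/data/events.parquet")

    // cache() claims *storage* memory for the materialized partitions.
    df.cache()
    df.count()            // an action forces materialization

    // unpersist() gives back only that storage memory. The memory Spark
    // uses to run shuffles, joins and aggregations (execution memory) is
    // managed internally and is not released this way.
    df.unpersist(blocking = true)

    spark.stop()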

RE: releasing memory without stopping the spark context ?

2016-08-31 Thread Rajani, Arpan
Removing Data: Spark automatically monitors cache usage on each node and drops out old data partitions in a least-recently-used (LRU) fashion. If you would like to manually remove an RDD instead of waiting for it to fall out of the cache, use the RDD.unpersist() method. (Copied from the Spark programming guide.)
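A sketch of that manual route, assuming an existing SparkContext sc (e.g. in spark-shell) and made-up data:

    // Assumes an existing SparkContext `sc`, e.g. in spark-shell.
    val rdd = sc.parallelize(1 to 1000000)

    rdd.persist()   // mark the RDD for caching in storage memory
    rdd.count()     // an action materializes the cached partitions

    // Either wait for LRU eviction, or remove the cached blocks now:
    rdd.unpersist(blocking = true)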