Hi there,

My understanding is that the storage (cache) memory is calculated as:

executor heap size * spark.storage.safetyFraction * spark.storage.memoryFraction

The default value of spark.storage.safetyFraction is 0.9 and of spark.storage.memoryFraction is 0.6. When I started a Spark job on YARN, I set --executor-memory to 6g, so I expected the storage memory to be 6 * 0.9 * 0.6 = 3.24 GB. However, the Spark history server shows the reserved cache size for each executor as 3.1 GB, so the numbers do not add up. What am I missing?

Lan
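For reference, here is a minimal sketch of the calculation described above, assuming the formula and the Spark 1.x default values for spark.storage.memoryFraction and spark.storage.safetyFraction as stated in the question; the 3.1 GB figure is the value reported by the history server, not computed here:

```python
# Estimate of Spark storage memory, per the formula in the question:
# executor heap * spark.storage.memoryFraction * spark.storage.safetyFraction
executor_heap_gb = 6.0   # --executor-memory 6g
memory_fraction = 0.6    # spark.storage.memoryFraction (assumed default)
safety_fraction = 0.9    # spark.storage.safetyFraction (assumed default)

expected_gb = executor_heap_gb * memory_fraction * safety_fraction
print(f"expected storage memory: {expected_gb:.2f} GB")  # 3.24 GB

reported_gb = 3.1        # value shown on the Spark history server
print(f"unexplained gap: {expected_gb - reported_gb:.2f} GB")
```

The gap suggests the formula is not being applied to the raw 6g figure, which is the crux of the question.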