Hi All:

We know that some of Spark's memory is used for computation (e.g.,
spark.shuffle.memoryFraction) and some is used for caching RDDs for future
use (e.g., spark.storage.memoryFraction).

Is there any existing workload that utilizes both of them during its
running life cycle? I want to do some performance tuning by adjusting the
ratio between them.
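
For example, I imagine something along these lines would touch both pools,
since caching an RDD fills storage memory while a reduceByKey shuffle draws
from shuffle memory (just a sketch of the kind of workload I have in mind):

    // Needed on older Spark versions for pair-RDD operations:
    import org.apache.spark.SparkContext._

    val data = sc.parallelize(1 to 10000000)
      .map(i => (i % 1000, i.toLong))
      .cache()                          // fills storage memory on first action
    data.count()                        // materialize the cache
    val sums = data.reduceByKey(_ + _)  // shuffle uses shuffle memory
    sums.count()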

Thanks.

-- 
*Regards,*
*Zhaojie*
