Hi,

I was looking at the Spark UI, Executors tab, and I noticed that I have 597 MB
under "Shuffle Write" while I am using a cached temp table and Spark had about
2 GB of free memory (the number under Memory Used is 597 MB / 2.6 GB)?!

Shouldn't Shuffle Write be zero, with all the (map/reduce) tasks being
done in memory?
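For context, here is a minimal spark-shell sketch of the kind of job described above (the table name and source path are hypothetical). Note that caching a temp table keeps the *input* data in memory, but wide operations such as GROUP BY or JOIN still repartition data by key, and Spark writes those intermediate shuffle files to executor local disk regardless of free memory, which would explain a non-zero Shuffle Write:

```scala
// Hypothetical example -- names ("events", the path, sqlContext) are illustrative.
val df = sqlContext.read.json("/data/events.json")
df.registerTempTable("events")
sqlContext.cacheTable("events")   // input data is now held in executor memory

// The cached scan avoids re-reading the source, but the GROUP BY is a wide
// operation: rows must be repartitioned by userId, and that shuffle stage
// writes its map-side output to local disk (shown as "Shuffle Write" in the UI).
val counts = sqlContext.sql(
  "SELECT userId, COUNT(*) AS cnt FROM events GROUP BY userId")
counts.collect()
```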

best,

/Shahab
