Please find attached a screenshot showing no active tasks, but memory is still used.
[image: image.png]
On Sat, Nov 21, 2020 at 4:25 PM Amit Sharma wrote:
> I am using df.cache and also unpersisting it. But when I check the Spark UI
> Storage tab I still see cached memory usage. Do I need to do anything else?
Hi,
I am not sure whether you were writing pseudo-code or real code, but there
were a few issues in the SQL.
I have reproduced your example in the Spark REPL and everything worked as
expected; the result is the one you need.
Please see the full code below:
## *Spark 3.0.0*
>>> a = spark.read.csv("tab1",
I am using df.cache and also unpersisting it. But when I check the Spark UI
Storage tab I still see cache memory usage. Do I need to do anything else?
Also, in the Executors tab on the Spark UI, each executor's memory
(used/total) always displays some used memory; I am not sure whether that is
expected when the streaming job has no requests, then