Hi Warren,
this is often an exception stemming from an OutOfMemoryError at the executor level.
If you are caching data, make sure the storage level spills to disk when needed.
You could also try increasing off-heap memory to alleviate the issue.
Of course, handing more memory to the executors also helps.
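As a rough sketch, the suggestions above map onto the following settings (the sizes and the jar name are placeholders, tune them for your cluster):

```shell
# Give each executor more heap, and enable off-heap memory (sizes are illustrative).
spark-submit \
  --executor-memory 8g \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=2g \
  your-job.jar
```

For the caching part, persisting with `StorageLevel.MEMORY_AND_DISK` instead of plain `cache()` lets partitions that do not fit in memory spill to disk rather than trigger an OOM.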
Best regards,
Alessandr
Hi All,
I have seen this exception many times in my production environment for long-running
batch jobs. Is there a systematic summary of all the root causes of this
exception? Below is my analysis:
1. This happens when an executor tries to fetch the MapStatus of some shuffle.
2. Each executor maintains a local
Hi,
I am able to make it work in Spark 2.3.0. If you can change the version to
Spark 2.3, that would be good; otherwise let me know and I'll check on Spark
version 2.1.0.
The following is the code for Spark 2.3.0:
scala> var seq = Seq((10L,"Hello"),(10L,"Hi"))
seq: Seq[(Long, String)] = List((10,Hello), (10,Hi))