I have a Spark job that runs fine on small data, but when the data volume increases it fails with an "executor lost" error. My executor and driver memory are already set as high as the cluster allows. I have also tried increasing the overhead with --conf spark.yarn.executor.memoryOverhead=600, but that did not fix the problem. Is there any other way to resolve this?
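For reference, this is roughly how I am submitting the job. The flag values below are illustrative placeholders, not my exact settings; note that spark.yarn.executor.memoryOverhead is specified in megabytes, and the documented default is the larger of 384 MB or 10% of executor memory, so 600 MB may be small relative to a large executor heap:

```shell
# Illustrative spark-submit invocation on YARN; values are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 8g \
  --driver-memory 4g \
  --executor-cores 4 \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  my_job.py
```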
- Spark Executor Lost issue Aditya