Spark 1.6.x has a serious bug in its shuffle functionality:
the shuffle throws an OOM under heavy load. I've seen this error several times in
my heavy jobs:
java.lang.OutOfMemoryError: Unable to acquire 75 bytes of memory, got 0
It was fixed in both spark-2.0.0 and spark-1.6.x, BUT the spark-1.6 fix was NOT
merged: https://github.com/apache/spark/pull/13027
Is it possible to include the fix in spark-1.6.3?
On Fri, Oct 14, 2016 at 1:39 PM, Reynold Xin <r...@databricks.com> wrote:
> It's been a while and we have fixed a few bugs in branch-1.6. I plan to
> cut rc1 for 1.6.3 next week (just in time for Spark Summit Europe). Let me
> know if there are specific issues that should be addressed before that.