Also, could you include the MaxPermSize fix in spark-1.6.3?
https://issues.apache.org/jira/browse/SPARK-15067
Literally, just one word needs to be replaced:
https://github.com/apache/spark/pull/12985/files
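(For context, a hedged illustration rather than the actual patch: SPARK-15067 concerns the `-XX:MaxPermSize` JVM option, which Java 8 silently ignores with a VM warning because the permanent generation was removed. The option typically reaches the JVM through settings like the ones below in `spark-defaults.conf`; the 512m values here are hypothetical.)

```properties
# Hypothetical spark-defaults.conf entries. On Java 7 these cap the PermGen
# size; on Java 8 the flag is ignored and the JVM prints a warning, since
# PermGen no longer exists.
spark.driver.extraJavaOptions    -XX:MaxPermSize=512m
spark.executor.extraJavaOptions  -XX:MaxPermSize=512m
```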
On Fri, Oct 14, 2016 at 1:57 PM, Alexander Pivovarov <apivova...@gmail.com>
wrote:

> Hi Reynold
>
> Spark 1.6.x has a serious bug in its shuffle functionality:
> https://issues.apache.org/jira/browse/SPARK-14560
> https://issues.apache.org/jira/browse/SPARK-4452
>
> The shuffle throws an OOM under heavy load; I've seen this error several
> times on my heavy jobs:
>
> java.lang.OutOfMemoryError: Unable to acquire 75 bytes of memory, got 0
>         at org.apache.spark.memory.MemoryConsumer.allocatePage(MemoryConsumer.java:120)
>         at org.apache.spark.shuffle.sort.ShuffleExternalSorter.acquireNewPageIfNecessary(ShuffleExternalSorter.java:346)
>
>
> It was fixed in both spark-2.0.0 and spark-1.6.x, BUT the spark-1.6 fix was
> NOT merged - https://github.com/apache/spark/pull/13027
>
> Is it possible to include the fix in spark-1.6.3?
>
>
> Thank you
> Alex
>
>
> On Fri, Oct 14, 2016 at 1:39 PM, Reynold Xin <r...@databricks.com> wrote:
>
>> It's been a while and we have fixed a few bugs in branch-1.6. I plan to
>> cut rc1 for 1.6.3 next week (just in time for Spark Summit Europe). Let me
>> know if there are specific issues that should be addressed before that.
>> Thanks.
>>
>
>