Cc: "user @spark" <user@spark.apache.org>
Subject: Re: trouble understanding data frame memory usage
"java.io.IOException: Unable to acquire memory"
Unfortunately in 1.5 we didn't force operators to spill when we ran out of memory,
so there is not a lot you can do.
>>> coalesce(1), spark-1.6.0
>>>
>>> Caused by:
>>> java.lang.OutOfMemoryError: Unable to acquire 28 bytes of memory, got 0
>>>
>>> Hope this helps
>>>
>>> Andy
>>>
>>> From: Michael Armbrust <mich...@databricks.com>
>>> To: Andrew Davidson <a...@santacruzintegration.com>
>>> Cc: "user @spark" <user@spark.apache.org>
>>> Subject: Re: trouble understanding data frame memory usage
>>> "java.io.IOException: Unable to acquire memory"
> Unfortunately in 1.5 we didn't force operators to spill when we ran out of memory,
> so there is not a lot you can do. It would be awesome if you could test with 1.6
> and see if things are any better?
I am using spark 1.5.1. I am running into some memory problems with a java
unit test. Yes, I could fix it by setting Xmx (it's set to 1024M), however I
want to better understand what is going on so I can write better code in the
future. The test runs on a Mac, master="local[2]".
Unfortunately in 1.5 we didn't force operators to spill when we ran out of
memory, so there is not a lot you can do. It would be awesome if you could
test with 1.6 and see if things are any better?
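[For context on why 1.6 might behave differently: 1.6 replaced the fixed
execution/storage split with a unified memory manager, so the relevant
properties differ between the two versions. A sketch of the knobs involved;
defaults are as I recall them from the 1.5/1.6 configuration docs, so treat
the numbers as approximate:]

```
# Spark 1.5 legacy model: fixed fractions of the heap
spark.shuffle.memoryFraction   0.2    # execution memory (joins, sorts, aggregations)
spark.storage.memoryFraction   0.6    # cached blocks

# Spark 1.6 unified model: execution and storage borrow from each other
spark.memory.fraction          0.75   # heap share for execution + storage combined
spark.memory.useLegacyMode     false  # set true to restore the 1.5 behavior
```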
On Mon, Dec 28, 2015 at 2:25 PM, Andy Davidson <
a...@santacruzintegration.com> wrote:
The 200 number looks strangely similar to the following default number of
post-shuffle partitions, which is often left untuned:

spark.sql.shuffle.partitions

Here's the property defined in the Spark source:
are reducing the number of partitions.
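[If the 200 really is spark.sql.shuffle.partitions, it can be overridden for
a small unit test; an illustrative fragment, where the value 4 is arbitrary
and not a recommendation:]

```
# in the test's SparkConf, or in spark-defaults.conf
spark.sql.shuffle.partitions   4    # default is 200
```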
Kind regards
Andy
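[Since coalesce(1) comes up earlier in the thread: coalesce merges existing
partitions without a shuffle, so every record ends up flowing through one
task, whereas repartition redistributes records individually. A toy model in
plain Python — no Spark involved, purely to illustrate where records land:]

```python
def toy_coalesce(partitions, n):
    # Mimics Spark's coalesce: whole partitions are merged, never split,
    # and no individual records are redistributed (no shuffle).
    out = [[] for _ in range(n)]
    for i, part in enumerate(partitions):
        out[i % n].extend(part)
    return out

def toy_repartition(partitions, n):
    # Mimics a full shuffle: every record is redistributed individually.
    out = [[] for _ in range(n)]
    for part in partitions:
        for record in part:
            out[hash(record) % n].append(record)
    return out

parts = [[1, 2], [3, 4], [5, 6], [7, 8]]
print(toy_coalesce(parts, 1))  # all eight records funnel into a single partition
```

The toy ignores Spark's scheduling entirely; it only shows that with
coalesce(1) the whole dataset is concentrated into one partition, i.e. one task.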
From: Michael Armbrust <mich...@databricks.com>
Date: Monday, December 28, 2015 at 2:41 PM
To: Andrew Davidson <a...@santacruzintegration.com>
Cc: "user @spark" <user@spark.apache.org>
Subject: Re: trouble understanding data frame memory usage
> Unfortunately in 1.5 we didn't force operators to spill when we ran out of
> memory, so there is not a lot you can do. It would be awesome if you could
> test with 1.6 and see if things are any better?