You probably have an RDD of Java objects, which consume a huge amount of 
memory. If you use RDDs, you can try the Kryo serializer, which saves memory 
and may even be faster.
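For reference, enabling Kryo in Spark looks roughly like this (a minimal sketch; the app name and the registered record class are placeholders, not from the original mail):

```scala
import org.apache.spark.SparkConf

// Hypothetical record type standing in for your CSV row class.
case class MyRecord(id: Long, value: String)

// Switch Spark from default Java serialization to Kryo, which
// produces a much more compact binary representation of objects.
val conf = new SparkConf()
  .setAppName("ignite-spark-example") // placeholder app name
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Registering your classes up front avoids Kryo writing the full
  // class name with every serialized object.
  .registerKryoClasses(Array(classOf[MyRecord]))
```

Note that Kryo helps with serialized storage levels and shuffles; plain deserialized Java objects on the heap still carry per-object overhead.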

> On 29. Oct 2017, at 08:23, Yair Ogen <[email protected]> wrote:
> 
> Hi,
> 
> I'm trying out the ignite-spark support. I have a dataframe that was created 
> from reading a csv file sized around 800MB.
> 
> It seems that when I store the RDD from this dataframe in Ignite using the 
> saveValues API in IgniteContext, it takes around 2GB of RAM.
> 
> Naturally once we add more dataframes, joins and computations we get OOM 
> errors even though we have more than enough RAM.
> 
> Any ideas why the inflated memory?
> 
> Attached is my config.
> 
> Yair
> <ignite-config.xml>