Which Spark version is this with? We used to build each partition fully in 
memory before writing it out to disk, but this was fixed a while back (in 0.9.1, 
though it may also be in 0.9.0).
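For reference, a minimal sketch of the usage in question through the standard 
Scala API (the input path and variable names are just illustrative):

    import org.apache.spark.storage.StorageLevel

    // assumes an existing SparkContext named sc
    val rdd = sc.textFile("hdfs:///data/input")  // illustrative path
                .map(_.length)

    // DISK_ONLY asks Spark to keep partitions on disk only; before the
    // 0.9.1 fix, each partition was still materialized in memory while
    // being built, before being written out.
    rdd.persist(StorageLevel.DISK_ONLY)
    rdd.count()  // forces evaluation, so the partitions get persisted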

Matei

On May 19, 2014, at 12:41 AM, Sai Prasanna <ansaiprasa...@gmail.com> wrote:

> Hi all,
> 
> When I set the persist level to DISK_ONLY, Spark still tries to use memory 
> and caches.
> Any reason?
> Do I need to override some parameter elsewhere?
> 
> Thanks!
