Re: persist @ disk-only failing

2014-05-19 Thread Sai Prasanna
OK, thanks!


On Mon, May 19, 2014 at 10:09 PM, Matei Zaharia wrote:

> This is the patch for it: https://github.com/apache/spark/pull/50/. It
> might be possible to backport it to 0.8.
>
> Matei
>
> On May 19, 2014, at 2:04 AM, Sai Prasanna wrote:
>
> Matei, I am using 0.8.1!
>
> But is there a way to bypass the cache without moving to 0.9.1?
>
>
> On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia wrote:
>
>> What version is this with? We used to build each partition first before
>> writing it out, but this was fixed a while back (in 0.9.1, though it may
>> also be in 0.9.0).
>>
>> Matei
>>
>> On May 19, 2014, at 12:41 AM, Sai Prasanna wrote:
>>
>> > Hi all,
>> >
>> > When I set the persistence level to DISK_ONLY, Spark still tries to use
>> > memory and cache the data.
>> > Any reason?
>> > Do I need to override some parameter elsewhere?
>> >
>> > Thanks!
>>
>>
>
>


Re: persist @ disk-only failing

2014-05-19 Thread Matei Zaharia
This is the patch for it: https://github.com/apache/spark/pull/50/. It might be 
possible to backport it to 0.8.

Matei

On May 19, 2014, at 2:04 AM, Sai Prasanna wrote:

> Matei, I am using 0.8.1!
> 
> But is there a way to bypass the cache without moving to 0.9.1?
> 
> 
> On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia wrote:
> What version is this with? We used to build each partition first before
> writing it out, but this was fixed a while back (in 0.9.1, though it may
> also be in 0.9.0).
> 
> Matei
> 
> On May 19, 2014, at 12:41 AM, Sai Prasanna wrote:
> 
> > Hi all,
> >
> > When I set the persistence level to DISK_ONLY, Spark still tries to use
> > memory and cache the data.
> > Any reason?
> > Do I need to override some parameter elsewhere?
> >
> > Thanks!
> 
> 



Re: persist @ disk-only failing

2014-05-19 Thread Sai Prasanna
Matei, I am using 0.8.1!

But is there a way to bypass the cache without moving to 0.9.1?


On Mon, May 19, 2014 at 1:31 PM, Matei Zaharia wrote:

> What version is this with? We used to build each partition first before
> writing it out, but this was fixed a while back (in 0.9.1, though it may
> also be in 0.9.0).
>
> Matei
>
> On May 19, 2014, at 12:41 AM, Sai Prasanna wrote:
>
> > Hi all,
> >
> > When I set the persistence level to DISK_ONLY, Spark still tries to use
> > memory and cache the data.
> > Any reason?
> > Do I need to override some parameter elsewhere?
> >
> > Thanks!
>
>
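
For readers stuck on 0.8.x who cannot pick up the fix, one way to sidestep the cache entirely is to materialize the intermediate RDD on disk yourself and read it back. The thread does not spell out such a workaround, so the Scala sketch below is only illustrative: it assumes saveAsObjectFile and objectFile behave in 0.8.1 as in later releases, and all paths and names are placeholders.

import org.apache.spark.SparkContext

// Illustrative workaround sketch (not from the thread): rather than
// relying on persist(DISK_ONLY), write the intermediate RDD to disk
// explicitly and read it back, so its blocks never pass through the
// in-memory cache at all.
val sc = new SparkContext("local", "disk-only-workaround")

val intermediate = sc.textFile("input.txt")    // placeholder input path
  .map(_.toUpperCase)

val stagingPath = "/tmp/intermediate"          // placeholder staging path
intermediate.saveAsObjectFile(stagingPath)

// Later stages read the materialized copy instead of recomputing the
// lineage or caching it in memory.
val reloaded = sc.objectFile[String](stagingPath)
println(reloaded.count())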


Re: persist @ disk-only failing

2014-05-19 Thread Matei Zaharia
What version is this with? We used to build each partition first before writing
it out, but this was fixed a while back (in 0.9.1, though it may also be in 0.9.0).

Matei

On May 19, 2014, at 12:41 AM, Sai Prasanna wrote:

> Hi all,
> 
> When I set the persistence level to DISK_ONLY, Spark still tries to use
> memory and cache the data.
> Any reason?
> Do I need to override some parameter elsewhere?
> 
> Thanks!
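
To see which storage level an RDD actually ended up with when diagnosing behavior like the above, a minimal Scala sketch follows. It assumes getStorageLevel is available in the version at hand (it has long been part of the RDD API); the input path and app name are placeholders.

import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel

// Minimal check: request DISK_ONLY and print the level Spark recorded.
val sc = new SparkContext("local", "storage-level-check")

val rdd = sc.textFile("input.txt").persist(StorageLevel.DISK_ONLY)
println(rdd.getStorageLevel)   // should report a disk-only level

// An action materializes the blocks; they then appear under the Storage
// tab of the web UI, where memory vs. disk usage is visible.
rdd.count()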



persist @ disk-only failing

2014-05-19 Thread Sai Prasanna
Hi all,

When I set the persistence level to DISK_ONLY, Spark still tries to use
memory and cache the data.
Any reason?
Do I need to override some parameter elsewhere?

Thanks!
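
For reference, a minimal Scala sketch of the pattern this question describes: requesting disk-only persistence so cached partitions go to local disk rather than the JVM heap. The input path and app name are placeholders, not the poster's actual code.

import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel

// Sketch of the reported setup: persist an RDD with DISK_ONLY, which is
// meant to spill partitions straight to disk. On the affected versions,
// each partition was still built in memory before being written out.
val sc = new SparkContext("local", "disk-only-example")

val data = sc.textFile("input.txt")
  .map(line => line.split(" ").length)
  .persist(StorageLevel.DISK_ONLY)

// An action triggers computation; with DISK_ONLY the resulting blocks
// should live on disk rather than in the memory store.
println(data.count())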