I don't think so.

Spark does not keep results in memory unless you tell it to.

You have to explicitly call the cache method on your RDD:
linesWithSpark.cache()
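
For example, here is a minimal sketch in the spark-shell (assuming a SparkContext named sc and a README.md file in the working directory; both are assumptions for illustration):

    // Build an RDD of lines that mention "Spark"
    val textFile = sc.textFile("README.md")
    val linesWithSpark = textFile.filter(line => line.contains("Spark"))

    // Mark the RDD to be kept in memory after it is first computed
    linesWithSpark.cache()

    // The first action computes the RDD and caches it;
    // later actions reuse the cached data instead of re-reading the file
    linesWithSpark.count()
    linesWithSpark.count()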

Thanks,
Natu




On Fri, Oct 9, 2015 at 10:47 AM, vinod kumar <vinodsachin...@gmail.com>
wrote:

> Hi Guys,
>
> May I know whether cache is enabled in spark by default?
>
> Thanks,
> Vinod
>
