You can set it to MEMORY_AND_DISK; in that case data will spill to disk
when it exceeds the available memory.
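A minimal sketch of what this looks like in spark-shell (the RDD name and input path below are illustrative, and `sc` is assumed to be an existing SparkContext):

```scala
import org.apache.spark.storage.StorageLevel

// Hypothetical input path, for illustration only.
val lines = sc.textFile("hdfs:///some/path")

// MEMORY_AND_DISK: partitions that do not fit in memory are spilled
// to disk, instead of being dropped and recomputed as MEMORY_ONLY does.
lines.persist(StorageLevel.MEMORY_AND_DISK)
```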
Thanks
Best Regards
On Fri, Oct 23, 2015 at 9:52 AM, JoneZhang wrote:
> 1. Will Spark use disk when memory is not enough at the MEMORY_ONLY
> storage level?
>
Jone:
For #3, consider asking on the vendor's mailing list.
On Fri, Oct 30, 2015 at 7:11 AM, Akhil Das wrote:
> You can set it to MEMORY_AND_DISK, in this case data will fall back to
> disk when it exceeds the memory.
>
> Thanks
> Best Regards
>
> On Fri, Oct 23, 2015 at 9:52 AM, JoneZhang wrote:
1. Will Spark use disk when memory is not enough at the MEMORY_ONLY
storage level?
2. If not, how can I set the storage level when I use Hive on Spark?
3. Does Spark have any plan to dynamically choose between Hive on
MapReduce and Hive on Spark, based on SQL features?
Thanks in advance
Best