The issue with off-heap mode is that it's a pretty big behavior change and 
requires additional setup (also, for users running UDFs that allocate a lot 
of heap memory, it may not work as well).
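
(For context on the "additional setup" point: as I understand it, off-heap 
mode also requires spark.memory.offHeap.size to be set to a positive value, 
since it defaults to 0. A minimal sketch in Scala, with an arbitrary 2g size 
just for illustration:

    import org.apache.spark.SparkConf

    // Hedged sketch of what "enabled by default" would ask of users:
    // off-heap mode needs an explicit size (the default is 0).
    // The 2g value is an arbitrary example, not a recommendation.
    val conf = new SparkConf()
      .set("spark.memory.offHeap.enabled", "true")
      .set("spark.memory.offHeap.size", "2g")

And none of that off-heap budget helps UDFs that allocate ordinary JVM 
objects, which still land on the heap.)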

I can see us removing the legacy mode, since it's been legacy for a long time 
and perhaps very few users need it. How much code would it remove, though?
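
(To make that concrete, a rough sketch of what useLegacyMode pulls in, as far 
as I recall: the old static memory manager and its fraction settings, with 
their pre-1.6 defaults shown here for illustration only:

    import org.apache.spark.SparkConf

    // Hedged sketch: the legacy (pre-1.6, StaticMemoryManager) knobs
    // that spark.memory.useLegacyMode re-enables. The fractions are
    // the old defaults as I remember them, not recommendations.
    val conf = new SparkConf()
      .set("spark.memory.useLegacyMode", "true")
      .set("spark.shuffle.memoryFraction", "0.2")
      .set("spark.storage.memoryFraction", "0.6")
      .set("spark.storage.unrollFraction", "0.2")

Dropping it would presumably remove StaticMemoryManager and all of these 
knobs along with it.)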

On Thu, Jan 03, 2019 at 2:55 PM, Sean Owen <sro...@apache.org> wrote:

>
> Just wondering if there is a good reason to keep around the pre-Tungsten
> on-heap memory mode for Spark 3, and make spark.memory.offHeap.enabled
> always true? It would simplify the code somewhat, but I don't feel I'm
> fully aware of the tradeoffs.
>
> I know we didn't deprecate it, but it's been off by default for a long
> time. It could be deprecated, too.
>
> Same question for spark.memory.useLegacyMode and all its various
> associated settings? Seems like these should go away at some point, and
> Spark 3 is a good point to do it. Same issue about deprecation, though.
