[
https://issues.apache.org/jira/browse/SPARK-12251?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15144986#comment-15144986
]
Ovidiu Marcu commented on SPARK-12251:
--------------------------------------
Reading through the latest documentation on memory management, I can see that
the parameter spark.memory.offHeap.enabled (false by default) is described as
‘If true, Spark will attempt to use off-heap memory for certain operations’ [1].
Can you please clarify which operations this refers to?
[1] http://spark.apache.org/docs/latest/configuration.html#memory-management
> Document Spark 1.6's off-heap memory configurations and add config validation
> -----------------------------------------------------------------------------
>
> Key: SPARK-12251
> URL: https://issues.apache.org/jira/browse/SPARK-12251
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Josh Rosen
> Assignee: Josh Rosen
> Fix For: 1.6.0
>
>
> We need to document the new off-heap memory limit configurations which were
> added in Spark 1.6, add simple configuration validation (for instance, you
> shouldn't be able to enable off-heap execution when the off-heap memory limit
> is zero), and alias the old and confusing `spark.unsafe.offHeap`
> configuration to something that lives in the `spark.memory` namespace.
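> The validation described above could be sketched roughly as follows. This is a
> hypothetical illustration, not the actual Spark code; the method name
> `validateOffHeapSettings` and the plain `Map[String, String]` stand in for
> Spark's real `SparkConf` handling:

```scala
// Hypothetical sketch of the off-heap config validation described above.
// If off-heap execution is enabled, the off-heap size limit must be positive.
object OffHeapConfigCheck {
  def validateOffHeapSettings(conf: Map[String, String]): Unit = {
    val enabled = conf.getOrElse("spark.memory.offHeap.enabled", "false").toBoolean
    val size = conf.getOrElse("spark.memory.offHeap.size", "0").toLong
    require(!enabled || size > 0,
      "spark.memory.offHeap.size must be > 0 when spark.memory.offHeap.enabled is true")
  }
}
```

> With such a check in place, enabling off-heap memory while leaving the size at
> its zero default fails fast at configuration time instead of at runtime.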
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]