When starting a local-mode Spark instance, e.g. new
SparkContext("local[4]", "app"), which memory configuration options
are available (and actually honored) for limiting Spark's memory usage?

For instance, suppose I have a JVM with a 64 GB heap and would like to
limit Spark to using only 32 GB of it.
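To make the question concrete, here is a sketch of what I have in mind (the app name and the 0.6 value are just placeholders). My understanding is that in local mode the executors run inside the driver JVM, so the overall cap would be the driver heap itself (-Xmx, or --driver-memory at launch; setting spark.driver.memory programmatically is too late once the JVM is up), while spark.memory.fraction would control how much of that heap Spark's unified memory manager (Spark 1.6+) may claim:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Launch the JVM with the desired cap, e.g.:
//   spark-shell --driver-memory 32g
// or plain -Xmx32g; spark.driver.memory set in code has no effect
// because the driver JVM is already running at that point.
val conf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("example")
  // Fraction of the heap Spark may use for execution + storage;
  // the remainder is left for user data structures (default 0.6).
  .set("spark.memory.fraction", "0.6")
val sc = new SparkContext(conf)
```

Is that the right mental model, or are there other knobs I should be looking at?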

thanks!
