spark.executor.memory and spark.driver.memory specify the size of the JVM
heap for the executor and the driver, respectively. You can learn a bit
more about memory usage here:
<https://spark.apache.org/docs/latest/tuning.html#memory-management-overview>
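
To make that concrete, here is a sketch of how these settings are typically passed at submit time (the class name and jar are hypothetical; the spark.memory.offHeap.* options show the separate knob Spark uses for memory outside the JVM heap):

```shell
# Heap sizes for driver and executors are plain JVM heap (-Xmx under the hood).
# Off-heap memory, if wanted, is a separate, explicitly enabled allocation.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.memory=4g \
  --conf spark.executor.memory=8g \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=2g \
  app.jar
```

Note that spark.memory.offHeap.size is allocated in addition to spark.executor.memory, not carved out of it, so the container/OS must have room for both.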

On Fri, Apr 17, 2020 at 4:07 PM Pat Ferrel <p...@occamsmachete.com> wrote:

> I have used Spark for several years and realize from recent chatter on
> this list that I don’t really understand how it uses memory.
>
> Specifically: are spark.executor.memory and spark.driver.memory taken from
> the JVM heap? When does Spark take memory from the JVM heap, and when is it
> from off the JVM heap?
>
> Since spark.executor.memory and spark.driver.memory are job params, I have
> always assumed that the required memory was off-JVM-heap.  Or am I on the
> wrong track altogether?
>
> Can someone point me to a discussion of this?
>
> thanks
>
