Github user Ngone51 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20269#discussion_r238135789
  
    --- Diff: 
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -38,10 +38,13 @@ package object config {
         
ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)
     
       private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
    +    .doc("Amount of memory to use for the driver process, in MiB unless 
otherwise specified.")
         .bytesConf(ByteUnit.MiB)
         .createWithDefaultString("1g")
     
       private[spark] val DRIVER_MEMORY_OVERHEAD = 
ConfigBuilder("spark.driver.memoryOverhead")
    +    .doc("The amount of off-heap memory to be allocated per driver in 
cluster mode, " +
    --- End diff --
    
    Hi @ferdonline, can you explain why this is described as **off-heap** memory?


---
