jpcorreia99 commented on code in PR #45240:
URL: https://github.com/apache/spark/pull/45240#discussion_r1505739465


##########
docs/configuration.md:
##########
@@ -202,6 +202,15 @@ of the most common options to set are:
   </td>
   <td>2.3.0</td>
 </tr>
+<tr>
+  <td><code>spark.driver.minMemoryOverhead</code></td>
+  <td>None</td>
+  <td>
+    The minimum amount of non-heap memory to be allocated per driver process in cluster mode, in MiB unless otherwise specified, if <code>spark.driver.memoryOverhead</code> is not defined.
+    This option is currently supported on YARN and Kubernetes.
+  </td>
+  <td>3.5.2</td>

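For context, overhead options like this one are typically set in <code>spark-defaults.conf</code> or via <code>--conf</code>. A minimal, hypothetical sketch of how the new option would sit next to the existing one (the 512 MiB value is an illustrative assumption, not from the PR):

```properties
# Hypothetical spark-defaults.conf fragment.
# Floors the driver's non-heap overhead at 512 MiB in cluster mode;
# per the docs text above, it applies only when
# spark.driver.memoryOverhead is not defined.
spark.driver.minMemoryOverhead  512
```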
Review Comment:
   What value should be added here? I could not find a roadmap of future releases for Spark, but I saw that the latest release is '3.5.1'.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
