GitHub user susanxhuynh commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19510#discussion_r147274681
  
    --- Diff: docs/running-on-mesos.md ---
    @@ -344,6 +345,13 @@ See the [configuration page](configuration.html) for information on Spark config
       </td>
     </tr>
     <tr>
    +  <td><code>spark.mem.max</code></td>
    +  <td><code>(none)</code></td>
    +  <td>
    +    Maximum amount of memory Spark accepts from Mesos when launching executors.
    --- End diff ---
    
    Maybe add "across the cluster (not from each machine)". Also, mention
    that there is no maximum if this property is not set.
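
    For example, a sketch of how this cap would be supplied at submit time
    (the master URL, memory sizes, and jar name below are illustrative
    placeholders, not taken from the PR):

        # Caps the total executor memory Spark accepts across the Mesos
        # cluster at 32g; with spark.mem.max unset there would be no maximum.
        spark-submit \
          --master mesos://mesos-master.example.com:5050 \
          --conf spark.executor.memory=4g \
          --conf spark.mem.max=32g \
          my-app.jar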

