srowen commented on a change in pull request #23664: [MINOR][DOCS] Add a note that 'spark.executor.pyspark.memory' is dependent on 'resource'
URL: https://github.com/apache/spark/pull/23664#discussion_r251238099
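For background on the note this PR adds: `spark.executor.pyspark.memory` only takes effect when Python's `resource` module is available. Below is a rough sketch of that dependency, not the actual Spark worker code; the helper name is made up for illustration. The limit amounts to capping the Python process's address space with `resource.setrlimit`, which is unavailable on platforms such as Windows.

```python
# Rough sketch (not the actual Spark code) of why the docs note says
# 'spark.executor.pyspark.memory' depends on Python's 'resource' module:
# the configured limit is enforced by capping the worker process's address
# space, which only works where 'resource' is importable (i.e. not Windows).
try:
    import resource
except ImportError:  # e.g. on Windows
    resource = None


def apply_pyspark_memory_limit(limit_bytes):
    """Cap this process's address space at limit_bytes, if supported.

    This helper name is made up for the example; it is not a Spark API.
    Returns True if a limit was applied, False otherwise.
    """
    if resource is None:
        # Without the 'resource' module the setting cannot be enforced.
        return False
    _, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
    return True


if __name__ == "__main__":
    # Arbitrary example value: a 2 GiB cap.
    print(apply_pyspark_memory_limit(2 * 1024 * 1024 * 1024))
```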
 
 

 ##########
 File path: docs/configuration.md
 ##########
 @@ -223,7 +225,8 @@ of the most common options to set are:
    stored on disk. This should be on a fast, local disk in your system. It can also be a
    comma-separated list of multiple directories on different disks.
 
-    NOTE: In Spark 1.0 and later this will be overridden by SPARK_LOCAL_DIRS (Standalone), MESOS_SANDBOX (Mesos) or
+    <br/>
+    <em>Note:</em> In Spark 1.0 and later this will be overridden by SPARK_LOCAL_DIRS (Standalone), MESOS_SANDBOX (Mesos) or
 
 Review comment:
  We could probably remove this note entirely, or just drop "In Spark 1.0 and later".
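
For context on the line under discussion: a `spark.local.dir` value set in the application's own configuration is overridden by `SPARK_LOCAL_DIRS` (Standalone) or `MESOS_SANDBOX` (Mesos) set by the cluster manager. A minimal PySpark sketch of setting it programmatically follows; the directory path used here is hypothetical.

```python
from pyspark import SparkConf, SparkContext

# Set spark.local.dir in the application config. Per the docs note under
# discussion, on a standalone cluster the SPARK_LOCAL_DIRS environment
# variable set by the cluster manager takes precedence over this value.
conf = (SparkConf()
        .setAppName("local-dir-example")
        .set("spark.local.dir", "/mnt/fast-disk/spark-tmp"))  # hypothetical path

sc = SparkContext(conf=conf)
print(sc.getConf().get("spark.local.dir"))
sc.stop()
```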

