Github user ankuriitg commented on a diff in the pull request:
https://github.com/apache/spark/pull/22504#discussion_r226421401
--- Diff: docs/configuration.md ---
@@ -266,6 +266,37 @@ of the most common options to set are:
Only has effect in Spark standalone mode or Mesos cluster deploy mode.
</td>
</tr>
+<tr>
+ <td><code>spark.driver.log.dfsDir</code></td>
+ <td>(none)</td>
+ <td>
+ Base directory in which Spark driver logs are synced, if spark.driver.log.syncToDfs.enabled is true.
+ Within this base directory, Spark creates a sub-directory for each application, and logs the driver logs
+ specific to the application in this directory. Users may want to set this to a unified location like an
+ HDFS directory so driver log files can be persisted for later usage. This directory should allow any spark
+ user to read/write files and the spark history server user to delete files. Additionally, older logs from
+ this directory are cleaned by Spark History Server if spark.history.fs.driverlog.cleaner.enabled is true.
+ They are cleaned if they are older than max age configured at spark.history.fs.driverlog.cleaner.maxAge.
+ </td>
+</tr>
+<tr>
+ <td><code>spark.driver.log.syncToDfs.enabled</code></td>
+ <td>false</td>
+ <td>
+ If true, spark application running in client mode will sync driver logs to a persistent storage, configured
--- End diff --
I guess what I meant to say here is that this feature enables syncing to
DFS. We do not support syncing to local disk; that is just an implementation
detail, not a feature.
I am good with using any alternate wording, if that represents the
intent better.
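
For reference, the options described in the diff could be set together in `spark-defaults.conf` along these lines (a sketch based only on the names appearing in this patch; the HDFS path and the 7d max age are placeholder values, not defaults confirmed here):

```
spark.driver.log.syncToDfs.enabled          true
spark.driver.log.dfsDir                     hdfs:///user/spark/driverLogs
spark.history.fs.driverlog.cleaner.enabled  true
spark.history.fs.driverlog.cleaner.maxAge   7d
```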
---