Github user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/22504#discussion_r227036915
--- Diff: docs/configuration.md ---
@@ -266,6 +266,41 @@ of the most common options to set are:
Only has effect in Spark standalone mode or Mesos cluster deploy mode.
</td>
</tr>
+<tr>
+ <td><code>spark.driver.log.dfsDir</code></td>
+ <td>(none)</td>
+ <td>
+    Base directory in which Spark driver logs are synced, if <code>spark.driver.log.persistToDfs.enabled</code>
+    is true. Within this base directory, Spark creates a sub-directory for each application, and logs the driver
+    logs specific to the application in this directory. Users may want to set this to a unified location like an
+    HDFS directory so driver log files can be persisted for later usage. This directory should allow any Spark
+    user to read/write files and the Spark History Server user to delete files. Additionally, older logs from
--- End diff --
We should also add something about this to the security doc, with specific
information on the required directory permissions, like we have for event logging:
https://spark.apache.org/docs/latest/security.html
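
For example, the security doc entry could sketch a setup along these lines (the path, owner, and permission bits below are only assumptions to illustrate the idea, not a vetted recommendation; the config keys are the ones added in this PR):

```
# Illustrative only: path, owner, and permission bits are assumptions, not an
# official recommendation. Create a driver log directory that any Spark user
# can write to and that the history server user (here "spark") can clean up:
hdfs dfs -mkdir -p /user/spark/driverLogs
hdfs dfs -chown spark:spark /user/spark/driverLogs
hdfs dfs -chmod 1777 /user/spark/driverLogs

# Then in spark-defaults.conf:
spark.driver.log.persistToDfs.enabled  true
spark.driver.log.dfsDir                hdfs:///user/spark/driverLogs
```

That would mirror how the event log directory guidance is written and make the permission expectations explicit.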