Github user vanzin commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r100918633
--- Diff: core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -64,6 +64,12 @@ private[spark] class EventLoggingListener(
   private val shouldOverwrite = sparkConf.getBoolean("spark.eventLog.overwrite", false)
   private val testing = sparkConf.getBoolean("spark.eventLog.testing", false)
   private val outputBufferSize = sparkConf.getInt("spark.eventLog.buffer.kb", 100) * 1024
+  // To reduce the size of event logs, we can omit logging all of internal accumulables for metrics.
+  private val omitInternalAccumulables =
+    sparkConf.getBoolean("spark.eventLog.omitInternalAccumulables", false)
+  // To reduce the size of event logs, we can omit logging "Updated Block Statuses" metric.
+  private val omitUpdatedBlockStatuses =
+    sparkConf.getBoolean("spark.eventLog.omitUpdatedBlockStatuses", false)
--- End diff --
Similarly here.
A good way to measure "is it needed?" is to check whether the UI needs the information. If it doesn't, then it's probably something we can live without. If the UI does use it, the option should be documented, and its name should more clearly reflect what the user is giving up by disabling it (e.g. "spark.eventLog.simplifiedStorageInfo" or something).