Github user markhamstra commented on a diff in the pull request:
https://github.com/apache/spark/pull/14737#discussion_r75602737
--- Diff: core/src/main/scala/org/apache/spark/ui/SparkUI.scala ---
@@ -141,6 +141,7 @@ private[spark] object SparkUI {
val DEFAULT_POOL_NAME = "default"
val DEFAULT_RETAINED_STAGES = 1000
val DEFAULT_RETAINED_JOBS = 1000
+ val DEFAULT_RETAINED_NODES = 2
--- End diff ---
`NODES`, both here and in `spark.ui.retainedNodes`, is far too ambiguous and
non-specific for this configuration value -- "node" is already overloaded many
times in the existing Spark code and documentation, and we don't need or want
to add another overload. A sketch of the kind of disambiguation I mean follows.
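For illustration only (the names below are hypothetical, mine rather than this
PR's), scoping both the constant and the config key to the structure the limit
actually applies to would remove the ambiguity:

```scala
package org.apache.spark.ui

// Hypothetical sketch: a scoped name cannot be misread as a cluster node,
// a DAG node in general, a tree node, etc. Neither identifier below exists
// in Spark; both are illustrative stand-ins.
private[spark] object SparkUIRetentionDefaults {
  val DEFAULT_RETAINED_GRAPH_NODES = 2
  val RETAINED_GRAPH_NODES_KEY = "spark.ui.dagGraph.retainedNodes"
}
```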
Additionally, the default behavior should match the current behavior: a silent
change in behavior would be unexpected, and it is far from clear to me that the
overwhelming majority of users would prefer the proposed new behavior.
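A minimal sketch of that second point, assuming the current behavior is
effectively unlimited retention (my assumption, not stated in the diff):
defaulting the limit to `Int.MaxValue` keeps existing deployments unchanged
and makes trimming strictly opt-in.

```scala
import org.apache.spark.SparkConf

object RetainedNodesExample {
  // Sketch only: Int.MaxValue effectively means "never trim", which
  // preserves today's behavior unless a user explicitly sets the key.
  val DEFAULT_RETAINED_NODES = Int.MaxValue

  // "spark.ui.dagGraph.retainedNodes" is the hypothetical key from the
  // sketch above, not an existing Spark configuration.
  def retainedNodes(conf: SparkConf): Int =
    conf.getInt("spark.ui.dagGraph.retainedNodes", DEFAULT_RETAINED_NODES)
}
```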