GitHub user tgravescs commented on a diff in the pull request:
https://github.com/apache/spark/pull/14673#discussion_r75119974
--- Diff: core/src/main/scala/org/apache/spark/ui/SparkUI.scala ---
@@ -141,6 +141,7 @@ private[spark] object SparkUI {
val DEFAULT_POOL_NAME = "default"
val DEFAULT_RETAINED_STAGES = 1000
val DEFAULT_RETAINED_JOBS = 1000
+ val DEFAULT_RETAINED_TASKS = 10000
--- End diff --
Do we have an idea of when people are hitting this? Personally, I find
10000 small as well; I was thinking more like 50000 or 100000, and users can
limit it further if they have the history server configured to use small
amounts of memory.
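
For context, a sketch of how a user could cap this themselves if the shipped
default is raised, assuming the new constant is exposed through a
`spark.ui.retainedTasks` setting (the key name here is an assumption based on
the naming pattern of the existing `spark.ui.retainedStages` and
`spark.ui.retainedJobs` settings):

```properties
# spark-defaults.conf — hypothetical override of the retained-tasks limit.
# Lower values reduce memory used by the UI / history server for task data;
# the existing stage/job limits are shown for comparison.
spark.ui.retainedTasks    50000
spark.ui.retainedStages   1000
spark.ui.retainedJobs     1000
```

With something like this available, users running a memory-constrained history
server could dial the value down, while the compiled-in default stays higher
as suggested above.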