This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new ec9d8c0bbbca [SPARK-52808][CORE] `spark.history.retainedApplications` should be positive
ec9d8c0bbbca is described below

commit ec9d8c0bbbca4069e67207817256e62bdf606ba9
Author: Dongjoon Hyun <dongj...@apache.org>
AuthorDate: Tue Jul 15 15:51:58 2025 -0700

    [SPARK-52808][CORE] `spark.history.retainedApplications` should be positive
    
    ### What changes were proposed in this pull request?
    
    This PR aims to add a config value validation for `spark.history.retainedApplications`.
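    
    As a rough illustration of the new rule (a minimal standalone sketch, not the actual `ConfigBuilder` code; `validateRetainedApplications` is a hypothetical helper), any value that is not strictly positive is rejected:
    ```
    // Hypothetical sketch of the guard added by this PR; not Spark's ConfigBuilder API.
    def validateRetainedApplications(value: Int): Int = {
      require(value > 0,
        "The number of applications to retain should be a positive integer.")
      value
    }

    validateRetainedApplications(50) // accepted (the default)
    // validateRetainedApplications(0) // throws IllegalArgumentException
    ```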
    
    ### Why are the changes needed?
    
    To prevent a misleading situation from the beginning.
    
    For example, with `spark.history.retainedApplications=0`, Apache Spark shows the job list but is unable to show an individual job, producing `Too many redirects...` messages. Although this is not actually a Spark bug, a user may be confused into thinking it is.
    
    <img width="454" height="337" alt="Screenshot 2025-07-15 at 13 31 29" 
src="https://github.com/user-attachments/assets/f0b6e945-759e-4503-a716-69c49ad91a93";
 />
    
    <img width="587" height="149" alt="Screenshot 2025-07-15 at 13 34 23" 
src="https://github.com/user-attachments/assets/e49f70fd-708f-4d3a-afbc-e2b9b21dc3f3";
 />
    
    ### Does this PR introduce _any_ user-facing change?
    
    No. Previously, SHS already did not work correctly with such a value.
    
    ### How was this patch tested?
    
    Pass the CIs.
    
    Since this is a check at the SparkConf level, the failure happens before the History Server is created, so we cannot add a test case for this to HistoryServerSuite.
    
    We can test it manually as follows; the History Server stops at startup.
    ```
    25/07/15 13:43:00 INFO FsHistoryProvider: History server ui acls disabled; users with admin permissions: ; groups with admin permissions:
    Exception in thread "main" org.apache.spark.SparkIllegalArgumentException: [INVALID_CONF_VALUE.REQUIREMENT] The value '0' in the config "spark.history.retainedApplications" is invalid. The number of applications to retain should be a positive integer. SQLSTATE: 22022
    ```
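    
    For completeness, a self-contained, hypothetical sanity check of the same requirement (not part of the patch and not a HistoryServerSuite test; it only mirrors the `v > 0` condition that `checkValue` now enforces):
    ```
    import scala.util.Try

    // Mirror of the new requirement: the retained-application count must be > 0.
    def check(v: Int): Int = {
      require(v > 0, "The number of applications to retain should be a positive integer.")
      v
    }

    assert(Try(check(50)).isSuccess)  // the default value is accepted
    assert(Try(check(0)).isFailure)   // zero is rejected, as in the log above
    ```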
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #51504 from dongjoon-hyun/SPARK-52808.
    
    Authored-by: Dongjoon Hyun <dongj...@apache.org>
    Signed-off-by: Dongjoon Hyun <dongj...@apache.org>
---
 core/src/main/scala/org/apache/spark/internal/config/History.scala | 1 +
 1 file changed, 1 insertion(+)

diff --git a/core/src/main/scala/org/apache/spark/internal/config/History.scala b/core/src/main/scala/org/apache/spark/internal/config/History.scala
index bbd4afcaebab..f30628130862 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/History.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/History.scala
@@ -227,6 +227,7 @@ private[spark] object History {
       "exceeded, then the oldest applications will be removed from the cache. If an application " +
       "is not in the cache, it will have to be loaded from disk if it is accessed from the UI.")
     .intConf
+    .checkValue(v => v > 0, "The number of applications to retain should be a positive integer.")
     .createWithDefault(50)
 
   val PROVIDER = ConfigBuilder("spark.history.provider")


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
