[ https://issues.apache.org/jira/browse/SPARK-25550?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-25550.
-------------------------------
    Resolution: Won't Fix

> [Spark Job History] Environment Page of Spark Job History UI showing wrong 
> value for spark.ui.retainedJobs
> -----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25550
>                 URL: https://issues.apache.org/jira/browse/SPARK-25550
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 2.3.1
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>         Attachments: Screenshot from 2018-09-27 12-17-44.png, Screenshot from 
> 2018-09-27 12-19-05.png
>
>
> 1. Set spark.ui.retainedJobs=200 in the spark-defaults.conf of the Job 
> History server
> 2. Launch spark-shell --master yarn --conf spark.ui.retainedJobs=100
> 3. Execute the commands below to create 1000 jobs:
> val rdd = sc.parallelize(1 to 5, 5)
> for (i <- 1 to 1000) {
>   rdd.count()
> }
> 4. Open the Job History page
> 5. Click the corresponding spark-shell application ID link in the Job 
> History and open the Jobs page for that application
> 6. Go to the Environment page and check the value of spark.ui.retainedJobs. 
> It shows spark.ui.retainedJobs = 100 (see the cross-check sketched after 
> these steps)
> 7. Check the total number of jobs in the summary on the Jobs page. It shows 
> only 200 completed jobs, which is the value configured in the 
> spark-defaults.conf file of the Job History server
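> As an optional cross-check (a minimal sketch, assuming the spark-shell 
> session from step 2 is still open), the effective application-level value 
> can be read back from the SparkConf; getOption avoids an exception if the 
> key is not set:
> // Value the running application actually picked up; here it should be
> // the --conf override from step 2.
> sc.getConf.getOption("spark.ui.retainedJobs")  // expected: Some("100")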
> Actual Result: 
> The Environment page shows the spark.ui.retainedJobs value that was set on 
> the command line, but the Jobs page displays a number of jobs based on the 
> value set in the spark-defaults.conf of the Job History server
> Expected Result:
> The value of spark.ui.retainedJobs on the Environment page should be the 
> one configured in the spark-defaults.conf of the Job History server
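> For reference, the Job History server side setting from step 1 corresponds 
> to an entry like the following in that server's spark-defaults.conf (value 
> taken from this report; the file location depends on the deployment):
> spark.ui.retainedJobs   200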



