Re: [Spark UI] Spark 2.3.1 UI no longer respects spark.ui.retainedJobs

2018-10-20 Thread Marcelo Vanzin
On Tue, Oct 16, 2018 at 9:34 AM Patrick Brown
 wrote:
> I recently upgraded to Spark 2.3.1. I have had these same settings in my
> spark-submit script, which worked on 2.0.2 and, according to the
> documentation, appear not to have changed:
>
> spark.ui.retainedTasks=1
> spark.ui.retainedStages=1
> spark.ui.retainedJobs=1

I tried that locally on the current master and it seems to be working.
I don't have 2.3 easily in front of me right now, but will take a look
Monday.

-- 
Marcelo

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Re: [Spark UI] Spark 2.3.1 UI no longer respects spark.ui.retainedJobs

2018-10-20 Thread Shing Hing Man
I have the same problem after upgrading my application from Spark 2.2.1 to 
Spark 2.3.2 and running in YARN client mode.
I also noticed that in my Spark driver, org.apache.spark.status.TaskDataWrapper
instances could take up more than 2G of memory.

Shing


On Tuesday, 16 October 2018, 17:34:02 GMT+1, Patrick Brown wrote:
I recently upgraded to Spark 2.3.1. I have had these same settings in my 
spark-submit script, which worked on 2.0.2 and, according to the 
documentation, appear not to have changed:
spark.ui.retainedTasks=1
spark.ui.retainedStages=1
spark.ui.retainedJobs=1
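
For reference, a minimal sketch of how such settings are typically passed on the spark-submit command line; the application jar, main class, and master shown here are placeholders, not taken from the thread:

```shell
# Hypothetical spark-submit invocation illustrating the three UI retention
# settings discussed in this thread. Jar, class, and master are examples only.
spark-submit \
  --class com.example.MyApp \
  --master yarn \
  --deploy-mode client \
  --conf spark.ui.retainedTasks=1 \
  --conf spark.ui.retainedStages=1 \
  --conf spark.ui.retainedJobs=1 \
  my-app.jar
```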
However, in 2.3.1 the UI doesn't seem to respect this; it still retains a huge 
number of jobs:

[screenshot not preserved in the archive]

Is this a known issue? Any ideas?  