I believe the OP is referring to the application UI on port 4040.

The application UI on port 4040 is available only while the application is running.
As per the documentation:
"To view the web UI after the fact, set spark.eventLog.enabled to true before
starting the application. This configures Spark to log Spark events that encode
the information displayed in the UI to persisted storage."
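For example, a minimal sketch of enabling this when constructing the SparkConf (the application name and the HDFS event-log directory below are only assumed placeholders; adjust them for your environment):

    import org.apache.spark.{SparkConf, SparkContext}

    // Log Spark events so the UI for this application can be rebuilt
    // after it finishes (by the standalone master or the history server).
    val conf = new SparkConf()
      .setAppName("MyApp")                                    // assumed app name
      .set("spark.eventLog.enabled", "true")
      .set("spark.eventLog.dir", "hdfs:///spark-event-logs")  // assumed log directory
    val sc = new SparkContext(conf)

The same two properties can also go into conf/spark-defaults.conf instead of being set in code; the standalone master can then reconstruct the UI of completed applications from the event logs.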

Mohammed
Author: Big Data Analytics with Spark
<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>

From: Shixiong(Ryan) Zhu [mailto:shixi...@databricks.com]
Sent: Monday, February 29, 2016 4:03 PM
To: Sumona Routh
Cc: user@spark.apache.org
Subject: Re: Spark UI standalone "crashes" after an application finishes

Do you mean you cannot access Master UI after your application completes? Could 
you check the master log?

On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
Hi there,
I've been doing some performance tuning of our Spark application, which is
using Spark 1.2.1 standalone. I have been using the Spark metrics to graph out
details as I run the jobs, as well as the UI to review the tasks and stages.

I notice that after my application completes, or is near completion, the UI
"crashes": I get a Connection Refused response. Sometimes the page eventually
recovers and loads again, but sometimes I end up having to restart the
Spark master to get it back. When I look at my graphs for the app, the memory
consumption of the driver, the executors, and what I believe to be the daemon
(spark.jvm.total.used) appears to be healthy. Monitoring the master machine
itself, memory and CPU appear healthy as well.
Has anyone else seen this issue? Are there logs for the UI itself, and where 
might I find those?
Thanks!
Sumona
