Re: Spark UI standalone "crashes" after an application finishes

2016-03-01 Thread Gourav Sengupta
Hi Teng,

I was not asking a question; I was explaining what to expect from the Spark
UI depending on how you start your Spark application.

Thanks and Regards,
Gourav

On Tue, Mar 1, 2016 at 8:30 PM, Teng Qiu  wrote:

> as Gourav said, the application UI on port 4040 will no longer be available
> after your Spark app finishes. You should go to the Spark master's UI
> (port 8080) and take a look at "Completed Applications"...
>
> Refer to the doc: http://spark.apache.org/docs/latest/monitoring.html
> and read the first "note that" :)
>
> 2016-03-01 21:13 GMT+01:00 Gourav Sengupta :
> > Hi,
> >
> > If you are submitting your Spark jobs, then the UI is only available
> > while the job is running.
> >
> > If instead you are starting a Spark cluster in standalone mode, on Hadoop,
> > etc., then the Spark UI remains alive.
> >
> > Another way to keep the Spark UI alive is to use a Jupyter notebook for
> > Python or Scala (see Apache Toree), or to use Zeppelin.
> >
> >
> > Regards,
> > Gourav Sengupta
> >
> > On Mon, Feb 29, 2016 at 11:48 PM, Sumona Routh 
> wrote:
> >>
> >> Hi there,
> >> I've been doing some performance tuning of our Spark application, which
> is
> >> using Spark 1.2.1 standalone. I have been using the spark metrics to
> graph
> >> out details as I run the jobs, as well as the UI to review the tasks and
> >> stages.
> >>
> >> I notice that after my application completes, or is near completion, the
> >> UI "crashes." I get a Connection Refused response. Sometimes, the page
> >> eventually recovers and will load again, but sometimes I end up having
> to
> >> restart the Spark master to get it back. When I look at my graphs on the
> >> app, the memory consumption (of driver, executors, and what I believe
> to be
> >> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
> >> master machine itself, memory and CPU appear healthy as well.
> >>
> >> Has anyone else seen this issue? Are there logs for the UI itself, and
> >> where might I find those?
> >>
> >> Thanks!
> >> Sumona
> >
> >
>


Re: Spark UI standalone "crashes" after an application finishes

2016-03-01 Thread Teng Qiu
as Gourav said, the application UI on port 4040 will no longer be available
after your Spark app finishes. You should go to the Spark master's UI
(port 8080) and take a look at "Completed Applications"...

Refer to the doc: http://spark.apache.org/docs/latest/monitoring.html
and read the first "note that" :)
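
For reference, the two UIs are served by different processes on different
ports; a minimal sketch for a default standalone deployment (hostnames here
are placeholders, not from this thread):

    # Application UI: served by the driver, only while the app is running.
    #   http://<driver-host>:4040
    # Master UI: served by the standalone master; lists running and
    # completed applications.
    #   http://<master-host>:8080
    # Quick reachability check from a shell, assuming the default port:
    curl -s http://<master-host>:8080 | grep -i "Completed Applications"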

2016-03-01 21:13 GMT+01:00 Gourav Sengupta :
> Hi,
>
> If you are submitting your Spark jobs, then the UI is only available
> while the job is running.
>
> If instead you are starting a Spark cluster in standalone mode, on Hadoop,
> etc., then the Spark UI remains alive.
>
> Another way to keep the Spark UI alive is to use a Jupyter notebook for
> Python or Scala (see Apache Toree), or to use Zeppelin.
>
>
> Regards,
> Gourav Sengupta
>
> On Mon, Feb 29, 2016 at 11:48 PM, Sumona Routh  wrote:
>>
>> Hi there,
>> I've been doing some performance tuning of our Spark application, which is
>> using Spark 1.2.1 standalone. I have been using the spark metrics to graph
>> out details as I run the jobs, as well as the UI to review the tasks and
>> stages.
>>
>> I notice that after my application completes, or is near completion, the
>> UI "crashes." I get a Connection Refused response. Sometimes, the page
>> eventually recovers and will load again, but sometimes I end up having to
>> restart the Spark master to get it back. When I look at my graphs on the
>> app, the memory consumption (of driver, executors, and what I believe to be
>> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
>> master machine itself, memory and CPU appear healthy as well.
>>
>> Has anyone else seen this issue? Are there logs for the UI itself, and
>> where might I find those?
>>
>> Thanks!
>> Sumona
>
>




Re: Spark UI standalone "crashes" after an application finishes

2016-03-01 Thread Gourav Sengupta
Hi,

If you are submitting your Spark jobs, then the UI is only available
while the job is running.

If instead you are starting a Spark cluster in standalone mode, on Hadoop,
etc., then the Spark UI remains alive.

Another way to keep the Spark UI alive is to use a Jupyter notebook for
Python or Scala (see Apache Toree), or to use Zeppelin.
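
(The reason notebooks keep the UI alive: the application UI is served by the
driver's SparkContext, so it stays up exactly as long as the context does. A
minimal Scala sketch with illustrative names, against the Spark 1.x API:)

    import org.apache.spark.{SparkConf, SparkContext}

    object UiLifetimeDemo {
      def main(args: Array[String]): Unit = {
        // The master URL is expected to come from spark-submit here.
        val sc = new SparkContext(new SparkConf().setAppName("ui-lifetime-demo"))
        // While sc is alive, the application UI is served by the driver,
        // by default at http://<driver-host>:4040.
        sc.parallelize(1 to 1000).sum()
        // A notebook (Jupyter/Toree, Zeppelin) keeps its context alive
        // between commands, which is why the UI stays reachable there.
        sc.stop() // after this, port 4040 stops serving
      }
    }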


Regards,
Gourav Sengupta

On Mon, Feb 29, 2016 at 11:48 PM, Sumona Routh  wrote:

> Hi there,
> I've been doing some performance tuning of our Spark application, which is
> using Spark 1.2.1 standalone. I have been using the spark metrics to graph
> out details as I run the jobs, as well as the UI to review the tasks and
> stages.
>
> I notice that after my application completes, or is near completion, the
> UI "crashes." I get a Connection Refused response. Sometimes, the page
> eventually recovers and will load again, but sometimes I end up having to
> restart the Spark master to get it back. When I look at my graphs on the
> app, the memory consumption (of driver, executors, and what I believe to be
> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
> master machine itself, memory and CPU appear healthy as well.
>
> Has anyone else seen this issue? Are there logs for the UI itself, and
> where might I find those?
>
> Thanks!
> Sumona
>


Re: Spark UI standalone "crashes" after an application finishes

2016-03-01 Thread Sumona Routh
Thanks Shixiong!
To clarify for others, yes, I was speaking of the UI at port 4040, and I do
have event logging enabled, so I can review jobs after the fact. We hope to
upgrade our version of Spark soon, so I'll write back if that resolves it.

Sumona

On Mon, Feb 29, 2016 at 8:27 PM Sea <261810...@qq.com> wrote:

> Hi, Sumona:
>   It's a bug in older Spark versions; it is fixed in Spark 1.6.0.
>   After the application completes, the Spark master loads the event log
> into memory, and this is synchronous because it runs inside an actor. If
> the event log is big, the Spark master will hang for a long time and you
> cannot submit any applications; if your master's memory is too small, your
> master will die!
>   The fix in Spark 1.6 is not ideal: the operation is now asynchronous,
> and so you still need to set a big Java heap for the master.
>
>
>
> -- Original Message --
> From: "Shixiong(Ryan) Zhu" <shixi...@databricks.com>
> Sent: Tuesday, March 1, 2016, 8:02 AM
> To: "Sumona Routh" <sumos...@gmail.com>
> Cc: "user@spark.apache.org" <user@spark.apache.org>
> Subject: Re: Spark UI standalone "crashes" after an application finishes
>
> Do you mean you cannot access the Master UI after your application
> completes? Could you check the master log?
>
> On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
>
>> Hi there,
>> I've been doing some performance tuning of our Spark application, which
>> is using Spark 1.2.1 standalone. I have been using the spark metrics to
>> graph out details as I run the jobs, as well as the UI to review the tasks
>> and stages.
>>
>> I notice that after my application completes, or is near completion, the
>> UI "crashes." I get a Connection Refused response. Sometimes, the page
>> eventually recovers and will load again, but sometimes I end up having to
>> restart the Spark master to get it back. When I look at my graphs on the
>> app, the memory consumption (of driver, executors, and what I believe to be
>> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
>> master machine itself, memory and CPU appear healthy as well.
>>
>> Has anyone else seen this issue? Are there logs for the UI itself, and
>> where might I find those?
>>
>> Thanks!
>> Sumona
>>
>
>


RE: Spark UI standalone "crashes" after an application finishes

2016-02-29 Thread Mohammed Guller
I believe the OP is referring to the application UI on port 4040.

The application UI on port 4040 is available only while the application is
running. As per the documentation:
To view the web UI after the fact, set spark.eventLog.enabled to true before 
starting the application. This configures Spark to log Spark events that encode 
the information displayed in the UI to persisted storage.
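
For reference, a minimal conf/spark-defaults.conf sketch for enabling event
logging (the log directory below is a placeholder; it must be a path visible
to whatever rebuilds the UI after the fact):

    spark.eventLog.enabled   true
    # Any Hadoop-compatible path works; this HDFS location is illustrative.
    spark.eventLog.dir       hdfs://namenode:8021/spark-events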

Mohammed
Author: Big Data Analytics with Spark
<http://www.amazon.com/Big-Data-Analytics-Spark-Practitioners/dp/1484209656/>

From: Shixiong(Ryan) Zhu [mailto:shixi...@databricks.com]
Sent: Monday, February 29, 2016 4:03 PM
To: Sumona Routh
Cc: user@spark.apache.org
Subject: Re: Spark UI standalone "crashes" after an application finishes

Do you mean you cannot access the Master UI after your application
completes? Could you check the master log?

On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
Hi there,
I've been doing some performance tuning of our Spark application, which is
using Spark 1.2.1 standalone. I have been using the spark metrics to graph
out details as I run the jobs, as well as the UI to review the tasks and
stages.

I notice that after my application completes, or is near completion, the UI
"crashes." I get a Connection Refused response. Sometimes, the page
eventually recovers and will load again, but sometimes I end up having to
restart the Spark master to get it back. When I look at my graphs on the
app, the memory consumption (of driver, executors, and what I believe to be
the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
master machine itself, memory and CPU appear healthy as well.

Has anyone else seen this issue? Are there logs for the UI itself, and
where might I find those?

Thanks!
Sumona



Re: Spark UI standalone "crashes" after an application finishes

2016-02-29 Thread Sea
Hi, Sumona:
  It's a bug in older Spark versions; it is fixed in Spark 1.6.0.
  After the application completes, the Spark master loads the event log
into memory, and this is synchronous because it runs inside an actor. If the
event log is big, the Spark master will hang for a long time and you cannot
submit any applications; if your master's memory is too small, your master
will die!
  The fix in Spark 1.6 is not ideal: the operation is now asynchronous, and
so you still need to set a big Java heap for the master.
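
For reference, the standalone master's own heap can be raised in
conf/spark-env.sh; a minimal sketch (the 4g value is illustrative, not a
recommendation from this thread):

    # conf/spark-env.sh on the master node.
    # SPARK_DAEMON_MEMORY sets the heap of the standalone master and
    # worker daemons themselves (default: 1g).
    export SPARK_DAEMON_MEMORY=4g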






-- Original Message --
From: "Shixiong(Ryan) Zhu" <shixi...@databricks.com>
Sent: Tuesday, March 1, 2016, 8:02 AM
To: "Sumona Routh" <sumos...@gmail.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Spark UI standalone "crashes" after an application finishes



Do you mean you cannot access the Master UI after your application
completes? Could you check the master log?

On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh <sumos...@gmail.com> wrote:
Hi there,

I've been doing some performance tuning of our Spark application, which is 
using Spark 1.2.1 standalone. I have been using the spark metrics to graph out 
details as I run the jobs, as well as the UI to review the tasks and stages.


I notice that after my application completes, or is near completion, the UI 
"crashes." I get a Connection Refused response. Sometimes, the page eventually 
recovers and will load again, but sometimes I end up having to restart the 
Spark master to get it back. When I look at my graphs on the app, the memory 
consumption (of driver, executors, and what I believe to be the daemon 
(spark.jvm.total.used)) appears to be healthy. Monitoring the master machine 
itself, memory and CPU appear healthy as well.


Has anyone else seen this issue? Are there logs for the UI itself, and where 
might I find those?


Thanks!

Sumona

Re: Spark UI standalone "crashes" after an application finishes

2016-02-29 Thread Shixiong(Ryan) Zhu
Do you mean you cannot access the Master UI after your application
completes? Could you check the master log?
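
For reference, in a default standalone deployment the master writes its log
under $SPARK_HOME/logs; a minimal sketch (the user and host parts of the
filename are placeholders):

    # Run on the master machine; the exact filename depends on the user
    # that started the daemon and on the hostname.
    tail -f $SPARK_HOME/logs/spark-<user>-org.apache.spark.deploy.master.Master-1-<host>.out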

On Mon, Feb 29, 2016 at 3:48 PM, Sumona Routh  wrote:

> Hi there,
> I've been doing some performance tuning of our Spark application, which is
> using Spark 1.2.1 standalone. I have been using the spark metrics to graph
> out details as I run the jobs, as well as the UI to review the tasks and
> stages.
>
> I notice that after my application completes, or is near completion, the
> UI "crashes." I get a Connection Refused response. Sometimes, the page
> eventually recovers and will load again, but sometimes I end up having to
> restart the Spark master to get it back. When I look at my graphs on the
> app, the memory consumption (of driver, executors, and what I believe to be
> the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
> master machine itself, memory and CPU appear healthy as well.
>
> Has anyone else seen this issue? Are there logs for the UI itself, and
> where might I find those?
>
> Thanks!
> Sumona
>


Spark UI standalone "crashes" after an application finishes

2016-02-29 Thread Sumona Routh
Hi there,
I've been doing some performance tuning of our Spark application, which is
using Spark 1.2.1 standalone. I have been using the spark metrics to graph
out details as I run the jobs, as well as the UI to review the tasks and
stages.
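
(For context, one way to graph Spark metrics like this is a Graphite sink in
conf/metrics.properties; a minimal sketch, where the host is a placeholder
and other sinks exist:)

    # Send metrics from all instances (master, worker, driver, executor)
    # to a Graphite/Carbon endpoint.
    *.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
    *.sink.graphite.host=graphite.example.com
    *.sink.graphite.port=2003
    *.sink.graphite.period=10
    *.sink.graphite.unit=seconds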

I notice that after my application completes, or is near completion, the UI
"crashes." I get a Connection Refused response. Sometimes, the page
eventually recovers and will load again, but sometimes I end up having to
restart the Spark master to get it back. When I look at my graphs on the
app, the memory consumption (of driver, executors, and what I believe to be
the daemon (spark.jvm.total.used)) appears to be healthy. Monitoring the
master machine itself, memory and CPU appear healthy as well.

Has anyone else seen this issue? Are there logs for the UI itself, and
where might I find those?

Thanks!
Sumona