Sorry, I slightly misunderstood the question.  I'm not sure if there's a way
to make the master UI read old log files after a restart, but the log files
themselves are human-readable text.

If you just want the application duration, the start and stop events are
timestamped; look for lines like these in EVENT_LOG_1:

{"Event":"SparkListenerApplicationStart","App Name":"cassandra-example-broadcast-join","Timestamp":1415763986601,"User":"cody"}

...

{"Event":"SparkListenerApplicationEnd","Timestamp":1415763999790}
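
As a rough sketch, you could compute the duration with a few lines of
Python (this assumes each line of the event log is a standalone JSON
object, as in the example above; the exact file layout may vary across
Spark versions):

```python
import json

def app_duration_ms(lines):
    """Scan Spark event-log lines and return end - start in milliseconds."""
    start = end = None
    for line in lines:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        if event.get("Event") == "SparkListenerApplicationStart":
            start = event["Timestamp"]
        elif event.get("Event") == "SparkListenerApplicationEnd":
            end = event["Timestamp"]
    if start is None or end is None:
        raise ValueError("missing application start or end event")
    return end - start

# Using the two example lines above:
sample = [
    '{"Event":"SparkListenerApplicationStart","App Name":"cassandra-example-broadcast-join","Timestamp":1415763986601,"User":"cody"}',
    '{"Event":"SparkListenerApplicationEnd","Timestamp":1415763999790}',
]
print(app_duration_ms(sample))  # 13189 ms, i.e. about 13.2 seconds
```

To run it against a log copied out of HDFS, pass `open("EVENT_LOG_1")`
(or whatever the file is named) as the `lines` argument.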



On Mon, Jan 12, 2015 at 3:56 PM, Chong Tang <[email protected]> wrote:

> Thank you, Cody! Actually, I have enabled this option, and I saved logs
> into Hadoop file system. The problem is, how can I get the duration of an
> application? The attached file is the log I copied from HDFS.
>
> On Mon, Jan 12, 2015 at 4:36 PM, Cody Koeninger <[email protected]>
> wrote:
>
>> http://spark.apache.org/docs/latest/monitoring.html
>>
>> http://spark.apache.org/docs/latest/configuration.html#spark-ui
>>
>> spark.eventLog.enabled
>>
>>
>>
>> On Mon, Jan 12, 2015 at 3:00 PM, ChongTang <[email protected]> wrote:
>>
>>> Is there anybody who can help me with this? Thank you very much!
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-recovery-application-running-records-when-I-restart-Spark-master-tp21088p21108.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: [email protected]
>>> For additional commands, e-mail: [email protected]
>>>
>>>
>>
>
