Matt, you should be able to set an HDFS path so the logs are written to a
unified place instead of to local disk on a random box in the cluster.
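
For illustration, a minimal sketch of what that could look like when building the
SparkConf in code; the HDFS URL and application name below are hypothetical
placeholders, and the same properties can just as well go in spark-defaults.conf:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: persist the event log to a shared HDFS directory so the finished UI
// can be served from one place (paths here are hypothetical).
val conf = new SparkConf()
  .setAppName("event-log-example")
  .set("spark.eventLog.enabled", "true")
  .set("spark.eventLog.dir", "hdfs://namenode:8020/user/spark/applicationHistory")

val sc = new SparkContext(conf)
// ... run the job ...
sc.stop()  // the event log is finalized on stop; a HistoryServer pointed at the
           // same directory can then render the finished UI
```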

On Thu, Sep 25, 2014 at 1:38 PM, Matt Narrell <matt.narr...@gmail.com>
wrote:

> How does this work with a cluster manager like YARN?
>
> mn
>
> On Sep 25, 2014, at 2:23 PM, Andrew Or <and...@databricks.com> wrote:
>
> Hi Harsha,
>
> You can turn on `spark.eventLog.enabled` as documented here:
> http://spark.apache.org/docs/latest/monitoring.html. Then, if you are
> running in standalone mode, you can access the finished SparkUI through the
> Master UI. Otherwise, you can start a HistoryServer to display finished UIs.
>
> -Andrew
>
> 2014-09-25 12:55 GMT-07:00 Harsha HN <99harsha.h....@gmail.com>:
>
>> Hi,
>>
>> The details laid out in the Spark UI for a job in progress are really
>> interesting and very useful, but they vanish once the job is done.
>> Is there a way to get the job details after the job finishes?
>>
>> I'm looking for the Spark UI data, not the standard input, output, and error info.
>>
>> Thanks,
>> Harsha
>>
>
>
>