For #1, click on a worker node on the YARN dashboard. From there,
Tools->Local logs->Userlogs has the logs for each application, and you can
view them by executor even while an application is running. (This is for
Hadoop 2.4; things may have changed in 2.6.)
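
If you'd rather stay on the command line, the same per-container files the UI
shows sit under the NodeManager's local log directory on each worker node. The
exact location depends on yarn.nodemanager.log-dirs, so treat the base path
below as an assumed default for illustration only:

    # run on the worker node that hosts the executor; adjust the base path
    # to match your yarn.nodemanager.log-dirs setting
    ls /var/log/hadoop-yarn/userlogs/application_1429087638744_101363/
    tail -f /var/log/hadoop-yarn/userlogs/application_1429087638744_101363/container_*/stderr
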
-Sven

On Thu, Apr 23, 2015 at 6:27 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> For step 2, you can pipe the application log to a file instead of
> copy-pasting.
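>
> For example (a sketch; this assumes YARN log aggregation is enabled, the
> application has finished, and the output file name is arbitrary):
>
>     yarn logs -applicationId application_1429087638744_101363 > app_101363.log
>     grep -n -E 'Exception|Job aborted due to' app_101363.log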
>
> Cheers
>
>
>
> On Apr 22, 2015, at 10:48 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com> wrote:
>
> I submit a Spark app to YARN and I get these messages:
>
> 15/04/22 22:45:04 INFO yarn.Client: Application report for
> application_1429087638744_101363 (state: RUNNING)
>
> 15/04/22 22:45:04 INFO yarn.Client: Application report for
> application_1429087638744_101363 (state: RUNNING)
>
> ...
>
>
> 1) I can go to the Spark UI and see the status of the app, but I cannot see
> the logs as the job progresses. How can I see the executors' logs while the
> job is running?
>
> 2) When the app fails or completes, the Spark UI vanishes and I get a YARN
> job page that says the job failed, with no link back to the Spark UI. I then
> take the application ID and run yarn logs -applicationId <appId>, and my
> console fills up (with huge scrolling) with the logs of all executors. Then I
> copy and paste them into a text editor and search for the keywords
> "Exception" and "Job aborted due to". Is this the right way to view logs?
>
> --
> Deepak
>
>


-- 
www.skrasser.com <http://www.skrasser.com/?utm_source=sig>
