I am looking for tips on evaluating my Spark job after it has run.

I know that I can browse the history of completed jobs through the Spark
web UI, and that a similar web UI shows the resources currently in use.

However, I would like to analyze the logs after a job finishes to
evaluate things such as how many tasks completed, how many executors
were used, etc. I currently save my logs to S3.
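In case it helps frame the question: if the logs in S3 are Spark event logs (the files written when spark.eventLog.enabled is set, one JSON object per line), a rough post-hoc analysis might look like the sketch below. The sample lines here are illustrative, not taken from a real job, but the event names (SparkListenerTaskEnd, SparkListenerExecutorAdded) are standard Spark listener events.

```python
import json

# Illustrative sample of Spark event-log lines; a real event log is one
# JSON object per line, downloaded from S3 before parsing.
sample_log = [
    '{"Event": "SparkListenerExecutorAdded", "Executor ID": "1"}',
    '{"Event": "SparkListenerExecutorAdded", "Executor ID": "2"}',
    '{"Event": "SparkListenerTaskEnd", "Task Info": {"Task ID": 0}}',
    '{"Event": "SparkListenerTaskEnd", "Task Info": {"Task ID": 1}}',
    '{"Event": "SparkListenerTaskEnd", "Task Info": {"Task ID": 2}}',
]

tasks_completed = 0
executors = set()
for line in sample_log:
    event = json.loads(line)
    # Count completed tasks and collect the distinct executor IDs seen.
    if event["Event"] == "SparkListenerTaskEnd":
        tasks_completed += 1
    elif event["Event"] == "SparkListenerExecutorAdded":
        executors.add(event["Executor ID"])

print(f"tasks completed: {tasks_completed}, executors used: {len(executors)}")
```

The same files can also be replayed through the Spark History Server, which renders them in the familiar web UI after the job is done.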

Thanks!

Henry

-- 
Paul Henry Tremblay
Robert Half Technology
