The Spark history server currently cannot serve executor logs. You need to
use the "yarn logs" command for that.
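
For example, once the application has finished and its logs have been
aggregated, something like the following should print the executor logs
(the application ID is a placeholder; use the ID shown by the YARN RM or
in the Spark UI):

    yarn logs -applicationId <application ID>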

On Tue, Apr 7, 2015 at 2:51 AM, donhoff_h <165612...@qq.com> wrote:
> Hi, Experts
>
> I run my Spark cluster on YARN. I used to get my executors' logs from the
> Spark History Server, but after I started the Hadoop job history server and
> configured log aggregation for Hadoop jobs into an HDFS directory, I could
> no longer get my Spark executors' logs. Is there a way to get the logs of my
> Spark jobs from the Spark History Server and the logs of my map-reduce jobs
> from the Hadoop History Server? Many thanks!
>
> The following is the configuration I made in Hadoop's yarn-site.xml:
> yarn.log-aggregation-enable=true
> yarn.nodemanager.remote-app-log-dir=/mr-history/agg-logs
> yarn.log-aggregation.retain-seconds=259200
> yarn.log-aggregation.retain-check-interval-seconds=-1
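
In case it helps to compare, here is a sketch of how those same settings
would look as property entries in yarn-site.xml (values copied from the
quoted message above, not recommendations):

  <!-- Sketch of the quoted log-aggregation settings in yarn-site.xml form -->
  <property>
    <name>yarn.log-aggregation-enable</name>
    <value>true</value>
  </property>
  <property>
    <name>yarn.nodemanager.remote-app-log-dir</name>
    <value>/mr-history/agg-logs</value>
  </property>
  <property>
    <name>yarn.log-aggregation.retain-seconds</name>
    <value>259200</value>
  </property>
  <property>
    <name>yarn.log-aggregation.retain-check-interval-seconds</name>
    <value>-1</value>
  </property>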



-- 
Marcelo
