[ https://issues.apache.org/jira/browse/SPARK-1940?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16276759#comment-16276759 ]
Apache Spark commented on SPARK-1940:
-------------------------------------
User 'tdas' has created a pull request for this issue:
https://github.com/apache/spark/pull/895
> Enable rolling of executor logs (stdout / stderr)
> -------------------------------------------------
>
> Key: SPARK-1940
> URL: https://issues.apache.org/jira/browse/SPARK-1940
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Tathagata Das
> Assignee: Tathagata Das
> Fix For: 1.1.0
>
>
> Currently, in the default log4j configuration, all the executor logs are sent
> to the file [executor-working-dir]/stderr. This does not allow the log files
> to be rolled, so old logs cannot be removed.
> Using log4j's RollingFileAppender allows the logs to be rolled, but they are
> then written to a different set of files rather than to stdout and stderr.
> As a result, the logs are no longer visible in the Spark web UI, which only
> reads the files stdout and stderr. Furthermore, this still does not allow
> stdout and stderr themselves to be cleared periodically in case a large
> amount of output is written to them (e.g., by an explicit println inside a
> map function).
> Solving this requires rolling the logs in such a way that the Spark web UI
> is aware of it and can retrieve the logs across the rolled-over files.
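The fix version is 1.1.0, where executor log rolling became configurable through the `spark.executor.logs.rolling.*` settings described in the Spark configuration documentation. As a minimal sketch (assuming those keys and their documented semantics), a size-based rolling policy could be set in `spark-defaults.conf` like this:

```properties
# Roll executor stdout/stderr by size so old logs can be deleted,
# while the Spark web UI can still read across rolled-over files.
# Keys assumed from the Spark 1.1.0+ configuration docs.
spark.executor.logs.rolling.strategy          size
# Roll a log file once it reaches this many bytes (128 MB here).
spark.executor.logs.rolling.maxSize           134217728
# Keep only the most recent rolled files; older ones are removed.
spark.executor.logs.rolling.maxRetainedFiles  10
```

A time-based policy is the documented alternative: set `spark.executor.logs.rolling.strategy` to `time` and control the rollover period with `spark.executor.logs.rolling.time.interval`.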
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)