[
https://issues.apache.org/jira/browse/HADOOP-5846?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12710331#action_12710331
]
Steve Loughran commented on HADOOP-5846:
----------------------------------------
Even if the logs go to the local filesystem today, is it not possible to run
something after the work has completed (on the same machines as the log files)
to push those logs into the DFS filesystem, and hence into something that can
merge the logs from different machines into one continuous timeline (assuming
such a timeline exists and can be determined)?
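
For what it's worth, a minimal sketch of that idea (not part of any patch here): push a finished job's local history files into the DFS, then k-way merge them by timestamp into one stream. It assumes each log line begins with an epoch-millis timestamp field and uses hypothetical paths; real history lines would need a real parser.

{code:java}
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintStream;
import java.util.PriorityQueue;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HistoryLogCollector {

  /** Copy every local history file for a finished job into a DFS directory. */
  static void pushLogs(File localLogDir, Path dfsDir) throws IOException {
    FileSystem fs = FileSystem.get(new Configuration());
    fs.mkdirs(dfsDir);
    for (File f : localLogDir.listFiles()) {
      fs.copyFromLocalFile(new Path(f.getAbsolutePath()),
                           new Path(dfsDir, f.getName()));
    }
  }

  /** One open reader plus its current line, ordered by leading timestamp. */
  static class Cursor implements Comparable<Cursor> {
    final BufferedReader reader;
    String line;
    Cursor(BufferedReader r) throws IOException { reader = r; line = r.readLine(); }
    // Assumption: each line starts with an epoch-millis timestamp field.
    long ts() { return Long.parseLong(line.split("\\s+", 2)[0]); }
    public int compareTo(Cursor o) {
      return ts() < o.ts() ? -1 : (ts() > o.ts() ? 1 : 0);
    }
  }

  /** K-way merge of all files under dfsDir into one continuous timeline. */
  static void mergeTimeline(Path dfsDir, PrintStream out) throws IOException {
    FileSystem fs = FileSystem.get(new Configuration());
    PriorityQueue<Cursor> heap = new PriorityQueue<Cursor>();
    for (FileStatus st : fs.listStatus(dfsDir)) {
      Cursor c = new Cursor(
          new BufferedReader(new InputStreamReader(fs.open(st.getPath()))));
      if (c.line != null) heap.add(c);
    }
    while (!heap.isEmpty()) {
      Cursor c = heap.poll();
      out.println(c.line);           // emit the globally earliest event
      c.line = c.reader.readLine();
      if (c.line != null) heap.add(c); else c.reader.close();
    }
  }
}
{code}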
> Log job history events to a common dump file
> --------------------------------------------
>
> Key: HADOOP-5846
> URL: https://issues.apache.org/jira/browse/HADOOP-5846
> Project: Hadoop Core
> Issue Type: New Feature
> Components: mapred
> Reporter: Amar Kamat
> Assignee: Amar Kamat
>
> As of today all the jobhistory events are logged to separate files. It would
> be nice to also dump all this info into a common file so that external tools
> (e.g. Chukwa) can harvest history info. Job configuration should also be
> dumped. Whether to use the same log file for history dumps and configuration
> dumps should be configurable (by default everything goes to one file).
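
A minimal sketch of the behavior the description asks for, assuming a hypothetical mapred.jobhistory.separate.conf.dump key (no such key exists yet): events and the job configuration go to one common file by default, or to two files when the flag is set.

{code:java}
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CommonHistoryDump {
  private final FSDataOutputStream events;
  private final FSDataOutputStream confDump;

  CommonHistoryDump(Configuration jobConf, Path dumpFile) throws IOException {
    FileSystem fs = FileSystem.get(jobConf);
    events = fs.create(dumpFile);
    // Hypothetical key: by default everything goes to the one common file.
    if (jobConf.getBoolean("mapred.jobhistory.separate.conf.dump", false)) {
      confDump = fs.create(dumpFile.suffix(".conf"));
    } else {
      confDump = events;
    }
  }

  /** Append one history event line so external tools (e.g. Chukwa) can tail it. */
  void logEvent(String eventLine) throws IOException {
    events.writeBytes(eventLine + "\n");
  }

  /** Configuration already knows how to serialize itself as XML. */
  void dumpConfiguration(Configuration jobConf) throws IOException {
    jobConf.writeXml(confDump);
  }

  void close() throws IOException {
    if (confDump != events) confDump.close();
    events.close();
  }
}
{code}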