thanks Billie - do you think you could go into a little more detail about
the workflow DAG stuff on the wiki? it's a little cryptic (to me anyway)  :)

From:  Billie Rinaldi <[email protected]>
Reply-To:  <[email protected]>
Date:  Mon, 20 Jan 2014 07:40:04 -0800
To:  <[email protected]>
Subject:  Re: Jobs view .. how to hook into it....

In Hadoop 1 only, there is a log4j appender on the JobTracker/JobHistory
that inserts the data into postgres (or whichever db you have configured).
The code is in contrib/ambari-log4j.
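Roughly, the mechanism looks like the sketch below. This is a minimal, hypothetical illustration of a log4j appender writing job-history events into Postgres over JDBC, not the actual contrib/ambari-log4j source; the class name, table, and column are made up for illustration.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    import org.apache.log4j.AppenderSkeleton;
    import org.apache.log4j.spi.LoggingEvent;

    // Hypothetical sketch only -- illustrates the general idea: an appender
    // attached to the JobTracker/JobHistory logger that inserts each
    // job-history event it receives into the configured database.
    public class JobHistoryDbAppender extends AppenderSkeleton {

        // Normally set from log4j.properties, e.g.
        // log4j.appender.JOBHISTORY.url=jdbc:postgresql://host/ambarirca
        private String url;
        private String user;
        private String password;

        private Connection conn;

        public void setUrl(String url) { this.url = url; }
        public void setUser(String user) { this.user = user; }
        public void setPassword(String password) { this.password = password; }

        @Override
        public void activateOptions() {
            try {
                conn = DriverManager.getConnection(url, user, password);
            } catch (SQLException e) {
                errorHandler.error("Could not connect to job history DB", e, 0);
            }
        }

        @Override
        protected void append(LoggingEvent event) {
            if (conn == null) {
                return;
            }
            // The real appender parses structured job/task history events;
            // here we just store the rendered message in a made-up table.
            String sql = "INSERT INTO job_events (event_text) VALUES (?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, event.getRenderedMessage());
                ps.executeUpdate();
            } catch (SQLException e) {
                errorHandler.error("Failed to insert job history event", e, 0);
            }
        }

        @Override
        public void close() {
            try {
                if (conn != null) {
                    conn.close();
                }
            } catch (SQLException e) {
                // ignore on shutdown
            }
        }

        @Override
        public boolean requiresLayout() {
            return false;
        }
    }

You would wire something like this up in the JobTracker's log4j.properties by attaching the appender to the job-history logger (the exact logger name and configuration keys depend on your Hadoop 1 version, so check contrib/ambari-log4j for the real ones).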

Billie


On Fri, Jan 17, 2014 at 1:59 PM, Aaron Cody <[email protected]> wrote:
> hello
> I'm looking at integrating my own process into the Ambari 'Jobs' view ... and I
> can see how the web side of things works, i.e. the view makes REST calls to
> the server which in turn results in a query to postgres to get the job stats ...
> but what is not so clear is how those job/task stats get into postgres in the
> first place...
> Q: for example, with MapReduce .. is Hadoop/JobTracker somehow inserting the
> job/task info into postgres directly? Or is there some other mechanism in
> Ambari that is listening for map reduce jobs/tasks to start/finish?
> 
> any hints on where to look in the source tree would be greatly appreciated
> TIA


