Mark N wrote:
I am trying to read the log files generated by Hadoop so that I
can show the status of map/reduce jobs in a user interface
(note that the UI is a separate application).

I was looking at the JobTracker APIs and was trying to use the following:
 1. jobtracker.getAllJobs(), which returns an array of JobStatus objects

 2. Then, for each JobStatus, we can call the
    mapProgress() and isJobCompleted() methods

The problem is: how do I connect my code to the existing JobTracker
(which is already running)? I am trying to build something like a "Hadoop UI."

Should I use the JobClient APIs instead? Also, is this the proper approach?

Thanks in advance. N Mark.


The JobClient API is full-featured, but brittle across Hadoop versions. If you look at Hadoop Studio, they have apparently added a plug-in layer so they can work with different versions of Hadoop more gracefully - you may want to use their code: http://www.hadoopstudio.org/
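For reference, here is a minimal sketch of the JobClient route, assuming the old org.apache.hadoop.mapred API and a reachable JobTracker; the host and port below are placeholders, not real values:

    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.JobStatus;

    public class JobStatusPoller {
        public static void main(String[] args) throws Exception {
            // Point the client at the running JobTracker. The address is a
            // placeholder - take it from your cluster's mapred-site.xml.
            JobConf conf = new JobConf();
            conf.set("mapred.job.tracker", "jobtracker-host:9001");

            JobClient client = new JobClient(conf);
            try {
                // getAllJobs() returns one JobStatus per job the tracker knows
                for (JobStatus status : client.getAllJobs()) {
                    System.out.printf("%s map=%.0f%% reduce=%.0f%% state=%d%n",
                            status.getJobID(),
                            status.mapProgress() * 100,
                            status.reduceProgress() * 100,
                            status.getRunState());
                }
            } finally {
                client.close();
            }
        }
    }

This runs in a separate JVM from the JobTracker, which is what a standalone UI needs - but as said above, the wire protocol behind it is version-specific, so client and cluster jars must match.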

Some of the JSP pages have status information, and there's a new one that pushes out some XML content, but nobody has yet sat down and written a stable, secure, long-haul interface to the JobTracker.

I've talked about it and have some ideas, but haven't started coding a proper RESTful front end:
http://www.slideshare.net/steve_l/long-haul-hadoop

-steve
