I'm going to throw my vote in for a structured log format. Users could
tail it and use whatever queueing or monitoring they wish. It's also
probably just a 30-minute project for someone already familiar with
the code. I suggest ^A-separated key=value pairs per log line.
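Something like the sketch below is what I have in mind; the helper
class and the field names are made up for illustration, not existing
Hive code:

    // Sketch: one ^A (Control-A) separated key=value record per job event.
    public class JobStatusLine {
      private static final char SEP = '\u0001';   // Control-A

      public static String format(String jobId, String state, String trackingUrl) {
        return "jobid=" + jobId + SEP
             + "state=" + state + SEP
             + "trackingurl=" + trackingUrl;
      }
    }

Consumers could then tail the file and split each line on ^A to get
the key=value pairs back out.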
Josh Ferguson
On Dec 8, 2008, at 1:09 PM, Ashish Thusoo <[EMAIL PROTECTED]> wrote:
Perhaps we should just have another log4j channel for this instead
of debug. The consumers can then just listen on this channel and
take appropriate action.
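For example, something along these lines; the "hive.ql.jobstatus"
channel name and the log4j.properties routing are only placeholders,
not anything that exists in Hive today:

    // Sketch: a dedicated log4j channel for job status events.
    import org.apache.log4j.Logger;

    public class JobStatusChannel {
      private static final Logger LOG = Logger.getLogger("hive.ql.jobstatus");

      public static void emit(String jobId, String state, String trackingUrl) {
        LOG.info("jobid=" + jobId + " state=" + state + " trackingurl=" + trackingUrl);
      }
    }

    # log4j.properties: route the channel to its own file so consumers can tail it
    log4j.logger.hive.ql.jobstatus=INFO, JOBSTATUS
    log4j.additivity.hive.ql.jobstatus=false
    log4j.appender.JOBSTATUS=org.apache.log4j.FileAppender
    log4j.appender.JOBSTATUS.File=/tmp/hive_jobstatus.log
    log4j.appender.JOBSTATUS.layout=org.apache.log4j.PatternLayout
    log4j.appender.JOBSTATUS.layout.ConversionPattern=%d{ISO8601} %m%n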
Another option would be to implement this using a message queue
(publish/subscribe system). We could leverage ActiveMQ or something
similar. That would be a bit more heavyweight, but potentially people
could develop more advanced monitoring applications around it.
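Roughly along these lines; the broker URL, topic name, and message
layout below are placeholders, not a worked-out design:

    // Sketch: publish job status events to an ActiveMQ topic over plain JMS.
    import javax.jms.Connection;
    import javax.jms.MessageProducer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;
    import org.apache.activemq.ActiveMQConnectionFactory;

    public class JobStatusPublisher {
      public static void publish(String jobId, String state, String trackingUrl)
          throws Exception {
        ActiveMQConnectionFactory factory =
            new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection conn = factory.createConnection();
        try {
          conn.start();
          Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
          Topic topic = session.createTopic("hive.jobstatus");
          MessageProducer producer = session.createProducer(topic);
          TextMessage msg = session.createTextMessage(
              "jobid=" + jobId + " state=" + state + " trackingurl=" + trackingUrl);
          producer.send(msg);
        } finally {
          conn.close();
        }
      }
    }

Subscribers would attach to the topic and react to job state changes
without ever touching the Hive logs.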
Ashish
From: Joydeep Sen Sarma [mailto:[EMAIL PROTECTED]]
Sent: Monday, December 08, 2008 12:17 PM
To: hive-user@hadoop.apache.org
Subject: RE: Hadoop JobStatus
The jobid is printed out in non-silent session execution mode.
Since there's no structured interface, I had tried to have
structured data emitted as key=value in the output stream. The
relevant output emitted here is from:
console.printInfo("Starting Job = " + rj.getJobID() + ",
Tracking URL = " + rj.getTrackingURL());
I would really welcome a discussion on a better way to get structured
data out of the output.
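In the meantime, a caller of hive -e can scrape the job id from that
console line and poll the JobTracker for its status later. This is a
rough sketch only: the regex assumes the exact message format above,
and the JobClient calls assume the old org.apache.hadoop.mapred API:

    // Sketch: recover the job id from Hive console output, then check on it later.
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.RunningJob;

    public class JobStatusCheck {
      private static final Pattern STARTING_JOB =
          Pattern.compile("Starting Job = (\\S+), Tracking URL = (\\S+)");

      // Returns the job id from a console line, or null if the line does not match.
      public static String extractJobId(String consoleLine) {
        Matcher m = STARTING_JOB.matcher(consoleLine);
        return m.find() ? m.group(1) : null;
      }

      // Asks the JobTracker for the job's current progress and completion state.
      public static void printStatus(String jobId) throws Exception {
        JobClient client = new JobClient(new JobConf());
        RunningJob job = client.getJob(jobId);
        if (job == null) {
          System.out.println(jobId + " not found on the JobTracker");
          return;
        }
        System.out.println(jobId
            + " map=" + job.mapProgress()
            + " reduce=" + job.reduceProgress()
            + " complete=" + job.isComplete()
            + " successful=" + (job.isComplete() && job.isSuccessful()));
      }
    }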
From: Josh Ferguson [mailto:[EMAIL PROTECTED]]
Sent: Monday, December 08, 2008 12:08 PM
To: hive-user@hadoop.apache.org
Subject: Hadoop JobStatus
When launching Hive queries using hive -e, is there a way to get
the job id so that I can just queue them up and go check their
statuses later? What's the general pattern for queueing and
monitoring without using the libraries directly?
Josh Ferguson