I'm running Hive queries that use Python scripts for the reducer.  This method 
seems to act like streaming in all the conventional ways (namely, output must 
be sent to stdout, and ancillary output can be sent to stderr).
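
For context, the reducer is an ordinary streaming-style script.  A 
stripped-down sketch of its shape (the field names and logic here are just 
illustrative):

#!/usr/bin/env python
import sys

# Stripped-down streaming-style reducer: rows arrive on stdin as
# tab-separated key/value pairs, results go to stdout, and anything
# ancillary goes to stderr.
rows = 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t", 1)
    rows += 1
    # ... real reduction logic here ...
    sys.stdout.write("%s\t%s\n" % (key, value))

sys.stderr.write("processed %d rows\n" % rows)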

I am not seeing status updates or counter updates, however.  I'm doing it the 
usual way, by sending messages of the following forms to stderr:

reporter:counter:foo:bar:123
reporter:status:hello
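
Concretely, the script writes these with small helpers along the lines of the 
sketch below (the helper names are mine; the counter message here uses the 
comma-separated form, reporter:counter:<group>,<counter>,<amount>, that the 
streaming docs describe):

import sys

def emit_counter(group, counter, amount):
    # Counter updates go to stderr using the streaming reporter protocol.
    sys.stderr.write("reporter:counter:%s,%s,%d\n" % (group, counter, amount))
    sys.stderr.flush()

def emit_status(message):
    # Status updates likewise go to stderr.
    sys.stderr.write("reporter:status:%s\n" % message)
    sys.stderr.flush()

emit_counter("foo", "bar", 123)
emit_status("hello")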

I'm also attempting to send keep-alives:
report:status:keep_alive
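
In other words, a status line written to stderr every so often while the 
reducer works, roughly along the lines of this sketch (the interval and 
message text are arbitrary, and the full reporter: prefix is used here):

import sys
import threading

def start_keepalive(interval_sec=60):
    # Re-emit a status line periodically so the task tracker keeps seeing
    # progress and doesn't time the reducer out.
    def beat():
        sys.stderr.write("reporter:status:keep_alive\n")
        sys.stderr.flush()
        t = threading.Timer(interval_sec, beat)
        t.daemon = True
        t.start()
    beat()

start_keepalive()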

I can see all of these messages appearing in stderr, but they don't show up in 
the expected places in the job tracker, and I have no idea whether the 
keep-alives are "working" either.

With regard to counters, I'm not sure whether I need to initialize or notify 
Hadoop of the group/counter IDs in advance, but status should be very 
straightforward, and I'm not seeing anything.  The status just says 
"reduce > reduce", as always.

Thanks for any help on this.

________________________________________________________________________________
Keith Wiley     [email protected]     keithwiley.com    music.keithwiley.com

"And what if we picked the wrong religion?  Every week, we're just making God
madder and madder!"
                                           --  Homer Simpson
________________________________________________________________________________
