Aren't these questions a little advanced for a bear to be asking? I'll be here all night...

But seriously, if your job is inherently recursive, one possible way to do it is to make sure you write your output in the same format that you read your input. Then you can keep feeding the output file back in as the input of a new map/reduce job, until you hit some base case and terminate. I've had a main method before that would kick off a bunch of jobs in a row -- but I wouldn't recommend starting another map/reduce job from inside a running map() or reduce() method.
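To make the idea concrete, here is a minimal, hypothetical sketch of that driver loop. The actual Hadoop job is stubbed out with a plain function (the class and method names are my own invention, not from any Hadoop API): each "job" reads records and writes records in the same format, so the driver can feed each iteration's output straight back in as the next iteration's input until a base case holds.

```java
import java.util.ArrayList;
import java.util.List;

public class IterativeDriver {

    // Stand-in for one map/reduce pass. For illustration it just halves
    // each integer record; output uses the same one-value-per-line format
    // as the input, so it can be fed back in directly.
    static List<String> runJob(List<String> input) {
        List<String> output = new ArrayList<>();
        for (String line : input) {
            output.add(Integer.toString(Integer.parseInt(line) / 2));
        }
        return output;
    }

    // Base case: stop once every record has reached zero.
    static boolean done(List<String> data) {
        for (String line : data) {
            if (Integer.parseInt(line) != 0) {
                return false;
            }
        }
        return true;
    }

    // The driver kicks off jobs in a row from main(), never from
    // inside a running map() or reduce() method.
    public static List<String> iterate(List<String> input) {
        List<String> current = input;
        while (!done(current)) {
            current = runJob(current);
        }
        return current;
    }

    public static void main(String[] args) {
        System.out.println(iterate(List.of("8", "3", "1")));
    }
}
```

In a real Hadoop setup, runJob() would configure and submit a job whose output path becomes the next iteration's input path, and done() would check whatever convergence condition your problem defines.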

- David


On Oct 29, 2007, at 2:17 PM, Jim the Standing Bear wrote:

