I've been dealing with a similar situation and I haven't found any
solution other than launching two independent jobs (with a script or
whatever you like), letting the output of the first be the input of the
second. If you find any other option, please let me know.
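
For reference, here is a minimal sketch of what I mean, assuming the
standard "hadoop pipes" command line; the HDFS paths and binary names
below are just placeholders, and the exact flags may vary with your
Hadoop version:

  #!/bin/bash
  set -e   # stop if job1 fails, so job2 never runs on missing input

  # Job 1: reads /user/me/input, writes its results to /user/me/job1_out
  hadoop pipes \
    -D hadoop.pipes.java.recordreader=true \
    -D hadoop.pipes.java.recordwriter=true \
    -input /user/me/input \
    -output /user/me/job1_out \
    -program /user/me/bin/job1_binary

  # Job 2: takes job1's output directory as its input
  hadoop pipes \
    -D hadoop.pipes.java.recordreader=true \
    -D hadoop.pipes.java.recordwriter=true \
    -input /user/me/job1_out \
    -output /user/me/job2_out \
    -program /user/me/bin/job2_binary

It's not real chaining inside Pipes, but the intermediate directory on
HDFS effectively glues the two jobs together.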

Regards


On 12 February 2014 12:55, Massimo Simoniello
<[email protected]> wrote:

> Hi,
>
> I'm using Hadoop Pipes and I want to chain two jobs (job1 and job2). Is it
> possible?
> I use the FileInputFormat.addInputPath()
> and FileOutputFormat.setOutputPath() functions to do it in Java, but I want
> to know if there is some way to do it in C++ with Pipes.
>
> Thanks in advance,
>
> Massimo
>
