That was something I used to do with Hadoop, and it's convenient when
testing things (so it's not critical).
For example, see what happens when you run the old "hadoop jar
hadoop-mapreduce-examples.jar" command: it "drives" you to the correct
invocation of that job.
However, the important thing is that I'd like to keep related jobs
somewhere (like a repository of jobs), deploy them, and then be able to
start the one I need from an external program.

Could this be done with RemoteExecutor? Or is there any web service to
manage job execution? That would be very useful.
Is the Client interface currently the only one that allows something like this?
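To make the ProgramDriver idea concrete, here is a minimal sketch of such a
job registry in plain Java. This is hypothetical code (Flink does not ship
this utility; the class and method names are made up for illustration): job
names map to Runnables, and asking for an unknown name lists what is
available, much like the hadoop-mapreduce-examples.jar launcher does.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical ProgramDriver-style registry: maps job names to runnable
// entry points, so an external caller can start a job by name.
public class JobRegistry {
    // LinkedHashMap keeps the registration order in the usage listing.
    private final Map<String, Runnable> jobs = new LinkedHashMap<>();

    public void register(String name, Runnable job) {
        jobs.put(name, job);
    }

    /** Runs the named job, or returns a usage listing if it is unknown. */
    public String run(String name) {
        Runnable job = jobs.get(name);
        if (job == null) {
            return "Unknown job '" + name + "'. Available: " + jobs.keySet();
        }
        job.run();
        return "Ran " + name;
    }

    public static void main(String[] args) {
        JobRegistry registry = new JobRegistry();
        registry.register("wordcount", () -> System.out.println("running wordcount"));
        registry.register("pagerank", () -> System.out.println("running pagerank"));
        System.out.println(registry.run(args.length > 0 ? args[0] : "help"));
    }
}
```

Each Runnable could internally build a Flink plan and hand it to a
RemoteExecutor, which would give you the "repository of jobs, started from
an external program" behavior described above.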

On Fri, Nov 21, 2014 at 6:19 PM, Stephan Ewen <[email protected]> wrote:

> I am not sure exactly what you need there. In Flink you can write more
> than one program in the same program ;-) You can define complex flows and
> trigger execution at arbitrary intermediate points:
>
> main() {
>   ExecutionEnvironment env = ...;
>
>   env.readSomething().map().join(...).and().so().on();
>   env.execute();
>
>   env.readTheNextThing().doSomething();
>   env.execute();
> }
>
>
> You can also just "save" a program and keep it for later execution:
>
> Plan plan = env.createProgramPlan();
>
> At a later point you can start that plan:
>
> new RemoteExecutor(master, 6123).execute(plan);
>
>
>
> Stephan
>
>
>
> On Fri, Nov 21, 2014 at 5:49 PM, Flavio Pompermaier <[email protected]>
> wrote:
>
>> Any help on this? :(
>>
>> On Fri, Nov 21, 2014 at 9:33 AM, Flavio Pompermaier <[email protected]
>> > wrote:
>>
>>> Hi guys,
>>> I forgot to ask you if there's a Flink utility to simulate the Hadoop
>>> ProgramDriver class that acts somehow like a registry of jobs. Is there
>>> something similar?
>>>
>>> Best,
>>> Flavio
>>>
>>
