Hi Komal,

Since you are using Flink's standalone deployment mode, the tasks of a job
that print to STDOUT may be deployed on any TaskManager in the cluster, so
the output can end up in a different TaskManager's .out file each time. Did
you check the .out files of the other TaskManagers?
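
If it helps to tell the two jobs' output apart, you could also give each
print sink an identifier. Here is a minimal sketch (the fromElements source
and the "job2" label are just placeholders for your actual Kafka source and
job name):

    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

    public class Job2 {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env =
                    StreamExecutionEnvironment.getExecutionEnvironment();

            // Placeholder source; in your jobs this would be the Kafka
            // consumer configured with your group-id and topic.
            env.fromElements("a", "b", "c")
               // The identifier is prepended to every printed line, so you
               // can grep for it across all taskexecutor .out files.
               .print("job2");

            env.execute("job2");
        }
    }

Also note that, by default, pressing Ctrl + C only stops the CLI client; the
job itself may keep running on the cluster. You can check the running jobs
with bin/flink list and cancel the first job with bin/flink cancel <jobId>
before submitting the second one.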

Best,
Vino

Komal Mariam <komal.mar...@gmail.com> wrote on Friday, November 22, 2019 at 6:59 PM:

> Dear all,
>
> Thank you for your help regarding my previous queries. Unfortunately, I'm
> stuck with another one and would really appreciate your input.
>
> I can't seem to get any output in "flink-taskexecutor-0.out" from my
> second job after submitting the first one on my 3-node Flink standalone
> cluster.
>
> Say I want to test out two jobs sequentially. (I do not want to run them
> concurrently/in parallel).
>
> After submitting "job1.jar" via the command line, I press "Ctrl + C" to
> exit from it (as it runs indefinitely). After that I try to submit a
> second jar file with the same properties (group-id, topic, etc.), the only
> difference being the query written in the main function.
>
> The first job produces the relevant output in "flink-taskexecutor-0.out",
> but the second one doesn't.
>
> The only way I can see the second job's output is to restart the cluster
> after job1 and then submit job2, since restarting creates another .out
> file.
>
> But I want to submit two jobs sequentially and see their outputs without
> having to restart my cluster. Is there any way to do this?
>
> Additional info:
> For both jobs I'm using the DataStream API and I have set:
>
>     StreamExecutionEnvironment env =
>             StreamExecutionEnvironment.getExecutionEnvironment();
>
> Best Regards,
> Komal
>
