Hello
I'm running a job with "flink run -p5" and I also set env.setParallelism(5) in the code.
The source of the stream is Kafka; the job uses FlinkKafkaConsumer010.
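For context, here is a simplified sketch of how the job is wired up (the topic name, Kafka properties, and the downstream map operator are placeholders; the actual code is in the pastebin link below):

import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema; // package may differ in older Flink versions
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;

public class KafkaParallelismJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.setParallelism(5); // same value as passed via "flink run -p5"

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "my-consumer-group");       // placeholder

        // FlinkKafkaConsumer010 extends RichParallelSourceFunction,
        // so I expect it to run with 5 parallel subtasks.
        FlinkKafkaConsumer010<String> consumer =
                new FlinkKafkaConsumer010<>("my-topic", new SimpleStringSchema(), props);

        env.addSource(consumer)
           .map(new MapFunction<String, String>() {   // placeholder downstream operator
               @Override
               public String map(String value) {
                   return value.toUpperCase();
               }
           })
           .print();

        env.execute("Kafka parallelism test");
    }
}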
In the Flink UI, though, I notice that if I send 3 documents to Kafka, only one
'instance' of the consumer seems to receive the Kafka records and forward them to
the next operators, which, according to the Flink UI, are properly parallelized.
What's the explanation of this behavior?
According to the Flink source code:

To enable parallel execution, the user defined source should implement
{@link org.apache.flink.streaming.api.functions.source.ParallelSourceFunction} or extend
{@link org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction}

which FlinkKafkaConsumer010 does.

Please check the screenshot at https://imgur.com/a/E1H9r: you'll see that only
one source instance sends the 3 records to the sinks.

My code is here: https://pastebin.com/yjYCXAAR

Thanks!
