Hi Piotrek,
I was checking the JobManager machine's logs and the dashboard, but the
output string was actually recorded in the TaskManager machine's log file. I
also added InfluxDB and verified that the received data is being written into InfluxDB.
Thank you very much for your support.
Thanks,
Shankara
Hi,
What version of Flink are you using? In earlier 1.3.x releases there were some
bugs in the Kafka consumer code.
Could you change the log level in Flink to debug?
Did you check the Kafka logs for some hint maybe?
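Changing Flink's log level is done in its log4j configuration. A minimal sketch, assuming the default conf/log4j.properties shipped with Flink 1.x (the appender name "file" is the stock default; adjust if yours differs):

```properties
# conf/log4j.properties -- raise the root level from INFO to DEBUG,
# then restart the cluster so Job/TaskManagers pick it up.
log4j.rootLogger=DEBUG, file
```

Note that DEBUG is verbose; you can instead target just the Kafka connector package if the full output is too noisy.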
I guess that metrics like bytes read/input records of this Flink application are
Hi,
I mean the same code works fine in the Flink local setup. I am able to see
"Received Message from testkafka Topic : " on the console when Kafka
receives a message (the Kafka producer is on another machine and frequently
sends messages to the testkafka topic).
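For reference, the pipeline described above can be sketched roughly as follows. This is an assumption about the code, not the poster's actual program: the class name, broker address (taken from the setup described later in the thread), and the print transform are illustrative.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.apache.kafka.common.serialization.StringDeserializer;

// Hypothetical reconstruction of the consumer pipeline discussed in the thread.
public class KafkaPrint {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(KafkaIO.<String, String>read()
            .withBootstrapServers("192.168.1.116:9092")   // broker from the setup below
            .withTopic("testkafka")
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())                            // PCollection<KV<String, String>>
     .apply(MapElements.into(TypeDescriptors.strings())
            .via((KV<String, String> kv) -> {
              // On a Flink cluster this println ends up in the TaskManager's
              // stdout/log, not on the client console or in the JobManager log.
              System.out.println("Received Message from testkafka Topic : " + kv.getValue());
              return kv.getValue();
            }));

    p.run().waitUntilFinish();
  }
}
```

The key point for cluster mode: System.out.println runs inside the task slots, so the output appears on the TaskManager machine, which matches what was eventually found in the TaskManager log file.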
*Submitted the Beam
Hi,
What do you mean by:
> With standalone beam application kafka can receive the message, But in
cluster setup it is not working.
In your example you are reading the data from Kafka and printing it to the
console. There doesn't seem to be anything that writes back to Kafka, so what
do you
Below is my setup:
1. Kafka ZooKeeper and server on one machine (192.168.1.116), and the
producer (192.168.1.100) and consumer (192.168.1.117) on other machines.
--> This works fine, no issues
2. Running the standalone Beam application with a Kafka consumer --> This
works fine
3.