Hi All,

I have a Spark Streaming application with a batch interval of 10 ms which
reads from an MQTT channel and dumps the data to HDFS.
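
For context, the job is roughly shaped like the sketch below. This is a
minimal sketch rather than the actual code, and it assumes the Bahir
spark-streaming-mqtt connector plus a placeholder broker URL, topic name and
HDFS path:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Milliseconds, StreamingContext}
    import org.apache.spark.streaming.mqtt.MQTTUtils

    object MqttToHdfs {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("mqtt-to-hdfs")
        // 10 ms batch interval, as described above
        val ssc = new StreamingContext(conf, Milliseconds(10))

        // Hypothetical broker URL and topic name
        val events = MQTTUtils.createStream(ssc, "tcp://broker-host:1883", "events")

        // Dump each batch to a hypothetical HDFS directory as text files
        events.saveAsTextFiles("hdfs:///data/mqtt/events")

        ssc.start()
        ssc.awaitTermination()
      }
    }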

Suppose I have to deploy a new application jar (with changes to the Spark
Streaming application). What is the best way to deploy it? Currently I do the
following:

1. kill the running streaming app using yarn application -kill <application ID>
2. start the application again with the new jar

The problem with this approach is that, since the events are not persisted in
MQTT, we miss any events published while the deploy is in progress.
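
The only Spark-side knob I am aware of is graceful shutdown, which only drains
batches the receiver has already picked up; it does nothing for messages
published to the broker while the app is down. A hedged sketch of that setting
(again assuming the job above):

    import org.apache.spark.SparkConf

    // Assumption, not our current code: ask Spark to try to finish the
    // batches already received when the JVM gets a shutdown signal.
    // This does NOT cover messages published to MQTT while the app is down.
    val conf = new SparkConf()
      .setAppName("mqtt-to-hdfs")
      .set("spark.streaming.stopGracefullyOnShutdown", "true")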

How should this case be handled?

Regards,
jeeetndra
