Hi,

When upgrading a Spark Structured Streaming application, do we need to
delete the checkpoint directory, or does the query resume consuming from the offsets it left off at?
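
To make the question concrete, something like the sketch below is what I have in mind. It is only a minimal illustration, assuming a console sink; the broker address and the checkpoint path "/tmp/checkpoints/my-query" are placeholders, not real values from our setup:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("kafka-restart-sketch").getOrCreate()

    // Kafka source; broker and topics are placeholder values.
    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "topicA,topicB")
      .load()

    // With checkpointLocation set, a restarted query resumes from the
    // offsets recorded in the checkpoint directory.
    val query = df.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/my-query")
      .start()

    query.awaitTermination()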

For the Kafka source, we can use the option "startingOffsets" with a JSON string like

    """{"topicA":{"0":23,"1":-1},"topicB":{"0":-2}}"""

Do we need to pass offsets like this on restart, storing them somewhere ourselves,
or is that unnecessary?
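
For reference, this is roughly how I would pass that JSON string; again just a sketch with a placeholder broker address (per the Spark Kafka integration guide, -2 means "earliest" and -1 means "latest", and startingOffsets only applies when the query starts without an existing checkpoint):

    val dfFromOffsets = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")   // placeholder broker
      .option("subscribe", "topicA,topicB")
      // Per-partition starting offsets as a JSON string.
      .option("startingOffsets", """{"topicA":{"0":23,"1":-1},"topicB":{"0":-2}}""")
      .load()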

Thanks,
Asmath

Sent from my iPhone
