Hi,

We are trying to migrate our existing code from Spark DStreams to Structured
Streaming for an old application we built a few years ago.

The Structured Streaming job doesn't have a Streaming tab in the Spark UI. Is
there a way to monitor a job we submit with Structured Streaming? Also, since
the job runs on every trigger, how can we stop the job and restart it if
needed?
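To make the question concrete, here is a rough sketch (names and the source/sink are illustrative, not our real pipeline) of the kind of control we are hoping exists via the StreamingQuery handle returned by start():

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.streaming.StreamingQuery

object StreamingControlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("MigratedApp").getOrCreate()

    // start() returns a handle we can use for monitoring and control
    val query: StreamingQuery = spark.readStream
      .format("rate")       // placeholder source, for illustration only
      .load()
      .writeStream
      .format("console")    // placeholder sink, for illustration only
      .start()

    // Monitoring: status and the progress of the most recent trigger
    println(query.status)
    println(query.lastProgress)

    // Graceful stop of just this query, rather than killing the whole app;
    // the application could then restart it from its checkpoint location
    query.stop()
    query.awaitTermination()
  }
}
```

Is this StreamingQuery-based approach (status/lastProgress for monitoring, stop() for shutdown) the recommended replacement for what the Streaming tab and StreamingContext.stop() gave us with DStreams?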

Any suggestions on this would be appreciated.

Thanks,
Asmath



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org