Hi, is there any way to add or delete actions/jobs dynamically in a running Spark Streaming job? The idea is to call an API on each batch and execute only the actions currently configured in the system.
For example: suppose the first batch runs 5 actions in the Spark application. Then the configuration changes so that one action is added and another is removed. How can I handle this in the Spark Streaming job without restarting the application?
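To make the question concrete, here is a minimal sketch of the kind of thing I have in mind. `fetchConfiguredActions()` is a hypothetical stand-in for the call to our configuration API, and the action names are just placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DynamicActionsSketch {
  // Hypothetical: returns the names of the actions currently enabled
  // in the external configuration system.
  def fetchConfiguredActions(): Set[String] = Set("count", "print")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("dynamic-actions-sketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))

    val lines = ssc.socketTextStream("localhost", 9999)

    // foreachRDD runs on the driver for every batch, so the configuration
    // could be re-read here and a different set of actions executed on each
    // batch without restarting the application.
    lines.foreachRDD { rdd =>
      val enabled = fetchConfiguredActions()
      if (enabled.contains("count")) println(s"count = ${rdd.count()}")
      if (enabled.contains("print")) rdd.take(10).foreach(println)
      // a newly configured action would be another branch here
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Is this per-batch re-reading of the configuration inside `foreachRDD` a reasonable approach, or is there a better-supported way to do this?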