From within a Spark job you can use a periodic listener:
import org.apache.spark.streaming.{Duration, Seconds}
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}
import org.slf4j.LoggerFactory

ssc.addStreamingListener(new PeriodicStatisticsListener(Seconds(60)))

class PeriodicStatisticsListener(timePeriod: Duration) extends StreamingListener {
  private val logger = LoggerFactory.getLogger("Application")

  // batchInfo carries the same statistics shown in the Streaming tab of the UI
  override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
    val info = batchCompleted.batchInfo
    logger.info(s"schedulingDelay=${info.schedulingDelay}, " +
      s"processingDelay=${info.processingDelay}, totalDelay=${info.totalDelay}")
  }
}
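To raise alerts from these numbers, a minimal sketch of the override above (the threshold value and the `alert` helper are hypothetical placeholders, not part of Spark):

```scala
// Hypothetical sketch: alert when total delay exceeds a threshold.
// `alert` stands in for whatever notification mechanism you use.
override def onBatchCompleted(batchCompleted: StreamingListenerBatchCompleted): Unit = {
  val maxTotalDelayMs = 60000L  // assumed threshold; tune for your job
  batchCompleted.batchInfo.totalDelay.foreach { delayMs =>
    if (delayMs > maxTotalDelayMs) alert(s"Streaming total delay is ${delayMs} ms")
  }
}
```

Note that `schedulingDelay`, `processingDelay`, and `totalDelay` are all `Option[Long]` in milliseconds, so `foreach` only fires when the value is present.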
Apologies in advance if someone has already asked and addressed this
question.
In Spark Streaming, how can I programmatically get batch statistics
such as scheduling delay, total delay, and processing time (the ones shown
in the Streaming tab of the job UI)? I need this information to raise
alerts in some cases.