Github user arunmahadevan commented on the issue:
https://github.com/apache/spark/pull/21721
> It seems like its life cycle should be bound to an epoch, but unfortunately we don't have such an interface in continuous streaming to represent an epoch. Is it possible that we may end up with 2 sets of custom metrics APIs for micro-batch and continuous?
@cloud-fan we could still report progress at the end of each epoch (e.g.
[here](https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/continuous/ContinuousExecution.scala#L231)
and via the EpochCoordinator). There need not be separate interfaces for
progress or custom metrics; only the reporting mechanisms would differ
between the two modes.
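
To illustrate the idea, here is a minimal, hypothetical sketch (not the actual API proposed in this PR): a single `CustomMetrics` trait shared by both execution modes, where micro-batch would snapshot the metrics at batch end and continuous mode would do the same when an epoch commits. The trait name, `CountingReaderMetrics`, and the epoch loop are all invented for illustration.

```scala
// Hypothetical sketch: one custom-metrics interface for both modes;
// only the point at which metrics are reported differs.
trait CustomMetrics {
  // JSON representation of the source's current metrics.
  def json(): String
}

// A toy reader-side implementation that counts emitted rows.
class CountingReaderMetrics extends CustomMetrics {
  @volatile private var rowsEmitted: Long = 0L
  def increment(): Unit = rowsEmitted += 1
  override def json(): String = s"""{"rowsEmitted":$rowsEmitted}"""
}

object EpochReportingExample {
  def main(args: Array[String]): Unit = {
    val metrics = new CountingReaderMetrics
    // Micro-batch: report after each batch. Continuous: the epoch
    // coordinator could trigger the same snapshot at epoch commit.
    (1 to 3).foreach { epoch =>
      (1 to 10).foreach(_ => metrics.increment())
      println(s"epoch $epoch committed, custom metrics: ${metrics.json()}")
    }
  }
}
```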