Github user cloud-fan commented on the issue:
https://github.com/apache/spark/pull/21721
Note that the data source v2 API is not stable yet, and we may even change the
abstraction of the APIs. The design of custom metrics may affect the design of
the streaming source APIs.
I had a hard time figuring out the life cycle of custom metrics. It seems
like its life cycle should be bound to an epoch, but unfortunately we don't
have an interface in continuous streaming that represents an epoch. Is it
possible that we may end up with two sets of custom metrics APIs, one for
micro-batch and one for continuous? The documentation added in this PR is not
clear about this.
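To make the life-cycle question concrete, here is a minimal sketch of the two shapes the API could take. All interface names below (`SupportsCustomMetrics`, `EpochCustomMetrics`, `CustomMetrics`) are hypothetical and only illustrate the concern; they are not the actual DSv2 interfaces or what this PR proposes.

```java
// Hypothetical sketch only. In micro-batch mode the metrics life cycle has a
// natural boundary (the batch), so a single no-argument getter could work:
interface SupportsCustomMetrics {
  // Called once per micro-batch after it completes.
  CustomMetrics getCustomMetrics();
}

// In continuous mode there is no batch boundary; metrics would have to be
// scoped to an epoch, but DSv2 has no interface representing an epoch today,
// so the getter would need an explicit epoch id (or a new epoch abstraction):
interface EpochCustomMetrics {
  // Called when the given epoch is committed.
  CustomMetrics getCustomMetrics(long epochId);
}

// Placeholder for whatever metrics representation is settled on,
// e.g. a JSON string surfaced through the streaming progress reporting.
interface CustomMetrics {
  String json();
}
```

If the two modes really do need different scoping as above, that is exactly the "two sets of APIs" outcome mentioned, which the PR's documentation should spell out either way.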