HeartSaVioR commented on a change in pull request #24996: [SPARK-28199][SS]
Move Trigger implementations to Triggers.scala and avoid exposing these to the
end users
URL: https://github.com/apache/spark/pull/24996#discussion_r302197454
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/Triggers.scala
##########
@@ -17,13 +17,94 @@
package org.apache.spark.sql.execution.streaming
+import java.util.concurrent.TimeUnit
+
+import scala.concurrent.duration.Duration
+
import org.apache.spark.annotation.{Evolving, Experimental}
import org.apache.spark.sql.streaming.Trigger
+import org.apache.spark.unsafe.types.CalendarInterval
+
+private[sql] object TriggerIntervalUtils {
Review comment:
Yeah, renaming is not a big deal; I'll do it once we have a decision on the
point below.
Regarding the necessity of this object: it abstracts `interval` to provide a
unified way to convert any of the units we support into the internal unit of
the interval. If we want to support other kinds of intervals we can add them
here, and any new trigger implementation that leverages an interval should
benefit from this.
I'd also like us to avoid having the same kind of methods spread across
`Trigger`, the companion object of each implementation, and this object. I
thought it was simplest to have this as a utility object, but if we want a
better shape I'll see what's possible; maybe a `trait` would work?
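To illustrate the idea being discussed, here is a minimal sketch of the kind of centralized conversion utility the comment describes. The object and method names are hypothetical (they are not the actual `TriggerIntervalUtils` API in the PR); the sketch only shows how unit conversion could live in one place so that every interval-based trigger reuses it.

```scala
import java.util.concurrent.TimeUnit

import scala.concurrent.duration.Duration

// Hypothetical sketch: one shared place to normalize any supported
// interval representation into the internal unit (milliseconds here).
// New trigger implementations would call into this instead of
// re-implementing the conversion themselves.
object IntervalConversions {

  // Convert a finite, non-negative duration to whole milliseconds.
  def toMillis(duration: Duration): Long = {
    require(duration.isFinite, "interval must be a finite duration")
    val ms = duration.toMillis
    require(ms >= 0, "interval must be non-negative")
    ms
  }

  // Convenience overload for a (value, unit) pair,
  // e.g. (10, TimeUnit.SECONDS) -> 10000 ms.
  def toMillis(value: Long, unit: TimeUnit): Long =
    toMillis(Duration(value, unit))
}

object Demo extends App {
  println(IntervalConversions.toMillis(10, TimeUnit.SECONDS))
  println(IntervalConversions.toMillis(Duration("2 minutes")))
}
```

If several triggers and companion objects all need this logic, keeping it in one object (or, as suggested, a shared `trait` mixed into each companion) avoids drift between copies of the same conversion code.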
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]