fedimser commented on code in PR #54254:
URL: https://github.com/apache/spark/pull/54254#discussion_r2819346989
##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##########
@@ -4019,6 +4020,24 @@ object CleanupAliases extends Rule[LogicalPlan] with AliasHelper {
}
}
+/**
+ * Validates that the event time column in EventTimeWatermark is a top-level column reference
+ * (e.g. a single name), not a nested field (e.g. "struct_col.field").
+ */
+object ValidateEventTimeWatermarkColumn extends Rule[LogicalPlan] {
+  override def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperatorsWithPruning(
+    _.containsPattern(EVENT_TIME_WATERMARK)) {
+    case etw: EventTimeWatermark =>
+      etw.eventTime match {
+        case u: UnresolvedAttribute if u.nameParts.length > 1 =>
Review Comment:
You are correct, this check would have disallowed using aliases in eventTime.
I fixed it by resolving the alias and checking whether it resolves to an Attribute.
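
One plausible shape of that fix, as a minimal standalone sketch rather than the actual patch (the object and method names below are illustrative; `trimAliases` is the `AliasHelper` utility already mixed into nearby rules such as CleanupAliases): strip any enclosing aliases first, then accept the event time only if what remains is a single-part column reference.

    import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute
    import org.apache.spark.sql.catalyst.expressions.{AliasHelper, Attribute, Expression}

    object EventTimeColumnCheck extends AliasHelper {
      /** Returns true if `eventTime` is (possibly an alias of) a top-level column reference. */
      def isTopLevelColumn(eventTime: Expression): Boolean =
        trimAliases(eventTime) match {
          // UnresolvedAttribute extends Attribute, so check the multi-part case first:
          // a nested field like "struct_col.field" has two name parts and must be rejected.
          case u: UnresolvedAttribute => u.nameParts.length == 1
          // Any other already-resolved attribute is a plain top-level column.
          case _: Attribute => true
          // Anything else (literals, function calls, ...) is not a column reference.
          case _ => false
        }
    }

In the rule itself this check would sit inside the `case etw: EventTimeWatermark` branch shown in the diff above, raising an analysis error when it returns false.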