wuchong commented on a change in pull request #10536: [FLINK-15191][connectors/kafka] Fix can't create table source for Kafka if watermark or computed column is defined.
URL: https://github.com/apache/flink/pull/10536#discussion_r357501589
##########
File path:
flink-table/flink-table-api-java-bridge/src/main/java/org/apache/flink/table/descriptors/SchemaValidator.java
##########
@@ -118,6 +131,36 @@ else if (proctimeFound) {
properties.validateExclusion(proctime);
}
}
+
+ validateWatermark(properties);
+ }
+
+ /**
+ * Validates watermarks if they exist.
+ */
+ private void validateWatermark(DescriptorProperties properties) {
+ final String schemaWatermarkKey = SCHEMA + "." + WATERMARK;
+ int watermarkRowtimeKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_ROWTIME).size();
+ int watermarkStrategyKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_STRATEGY_EXPR).size();
+ int watermarkStrategyDataTypeKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_STRATEGY_DATA_TYPE).size();
+
+ if (watermarkRowtimeKeys > 0 || watermarkStrategyKeys > 0 || watermarkStrategyDataTypeKeys > 0) {
+ if (!isStreamEnvironment) {
+ throw new ValidationException(
+ format("Property '%s' is not allowed in a batch environment.", schemaWatermarkKey));
Review comment:
Please remove this limitation. The DDL (or descriptor) should be unified, meaning the same DDL works in both batch and streaming mode. The difference is that the watermark information is simply ignored in batch mode.
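
For illustration, a minimal sketch (not part of this PR) of what `validateWatermark` could look like without the streaming-only guard. It reuses the constants and the `DescriptorProperties#getIndexedProperty` calls from the diff above; the cross-property consistency check is an assumption added here for illustration, not something this PR defines:

```java
// Hedged sketch only: validates that the indexed watermark properties are
// self-consistent, but no longer rejects them in a batch environment.
// Constant names (SCHEMA, WATERMARK, WATERMARK_ROWTIME, ...) are taken from
// the diff above; the equality check below is an assumption for illustration.
private void validateWatermark(DescriptorProperties properties) {
	final String schemaWatermarkKey = SCHEMA + "." + WATERMARK;
	int rowtimeKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_ROWTIME).size();
	int exprKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_STRATEGY_EXPR).size();
	int dataTypeKeys = properties.getIndexedProperty(schemaWatermarkKey, WATERMARK_STRATEGY_DATA_TYPE).size();

	// Every watermark entry needs a rowtime attribute, a strategy expression and its data type.
	if (rowtimeKeys != exprKeys || rowtimeKeys != dataTypeKeys) {
		throw new ValidationException(format(
			"Invalid watermark definition under '%s': '%s', '%s' and '%s' must be defined together.",
			schemaWatermarkKey, WATERMARK_ROWTIME, WATERMARK_STRATEGY_EXPR, WATERMARK_STRATEGY_DATA_TYPE));
	}
	// No isStreamEnvironment check here, so the same schema validates in batch
	// mode as well; the batch planner can simply ignore the watermark.
}
```

That way a single DDL/descriptor validates in both modes, and only the runtime behavior differs.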