Airblader commented on a change in pull request #17264:
URL: https://github.com/apache/flink/pull/17264#discussion_r711947608



##########
File path: 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/AbstractDialect.java
##########
@@ -18,80 +18,252 @@
 
 package org.apache.flink.connector.jdbc.dialect;
 
-import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.annotation.PublicEvolving;
 import org.apache.flink.table.api.ValidationException;
 import org.apache.flink.table.types.DataType;
 import org.apache.flink.table.types.logical.DecimalType;
 import org.apache.flink.table.types.logical.LogicalTypeRoot;
+import org.apache.flink.table.types.logical.RowType;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.VarBinaryType;
+import org.apache.flink.util.Preconditions;
 
-import java.util.List;
+import java.util.Arrays;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
 
-abstract class AbstractDialect implements JdbcDialect {
+import static java.lang.String.format;
+
+/**
+ * Base class for {@link JdbcDialect JdbcDialects} that implements basic data type validation and
+ * the construction of basic {@code INSERT}, {@code UPDATE}, {@code DELETE}, and {@code SELECT}
+ * statements.
+ *
+ * <p>Implementors should be careful to check the default SQL statements are performant for their
+ * specific dialect and override them if necessary.
+ */
+@PublicEvolving
+public abstract class AbstractDialect implements JdbcDialect {

Review comment:
       While I like this better than it was originally, I thought about this class a bit more (sorry). With the current changes, this class basically does two things now:
   
   * It provides default implementations for most methods. Since these are generic and sane, just not necessarily optimized, they are a good fit for a base implementation IMO.
   * It performs some level of validation, but really it just implements `validate` and requires implementors to implement various other methods instead, essentially changing the way the interface works.
   
   Is there any actual benefit we get out of the latter point? It seems to me that we could just ditch `validate` and the abstract methods of this class and move them into the specific implementations instead without really creating any duplication or additional effort. That would turn this class into a base class which provides generic default implementations. WDYT?
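
   Roughly what I have in mind, as a sketch with simplified stand-in types (`JdbcDialectSketch` etc. are hypothetical placeholders, not the real Flink classes): the base class keeps only the generic statement builders, and each concrete dialect owns its own validation instead of filling in abstract hook methods.

   ```java
   import java.util.Arrays;
   import java.util.stream.Collectors;

   // Stand-in for the JdbcDialect interface, reduced to one statement builder.
   interface JdbcDialectSketch {
       String getInsertIntoStatement(String tableName, String[] fieldNames);
   }

   // Base class provides only generic, sane defaults; no abstract validation hooks.
   abstract class AbstractDialectSketch implements JdbcDialectSketch {
       @Override
       public String getInsertIntoStatement(String tableName, String[] fieldNames) {
           String columns = String.join(", ", fieldNames);
           String placeholders =
                   Arrays.stream(fieldNames).map(f -> "?").collect(Collectors.joining(", "));
           return "INSERT INTO " + tableName + " (" + columns + ") VALUES (" + placeholders + ")";
       }
   }

   // A concrete dialect carries its own validation logic directly,
   // rather than implementing abstract methods consumed by a shared validate().
   class MySqlDialectSketch extends AbstractDialectSketch {
       void validate(String typeName) {
           if (typeName.equals("ARRAY")) {
               throw new IllegalArgumentException("This dialect does not support ARRAY");
           }
       }
   }
   ```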

##########
File path: 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/AbstractDialect.java
##########
@@ -18,80 +18,252 @@
 
 package org.apache.flink.connector.jdbc.dialect;
 
-import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.annotation.PublicEvolving;
 import org.apache.flink.table.api.ValidationException;
 import org.apache.flink.table.types.DataType;
 import org.apache.flink.table.types.logical.DecimalType;
 import org.apache.flink.table.types.logical.LogicalTypeRoot;
+import org.apache.flink.table.types.logical.RowType;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.VarBinaryType;
+import org.apache.flink.util.Preconditions;
 
-import java.util.List;
+import java.util.Arrays;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
 
-abstract class AbstractDialect implements JdbcDialect {
+import static java.lang.String.format;
+
+/**
+ * Base class for {@link JdbcDialect JdbcDialects} that implements basic data type validation and
+ * the construction of basic {@code INSERT}, {@code UPDATE}, {@code DELETE}, and {@code SELECT}
+ * statements.
+ *
+ * <p>Implementors should be careful to check the default SQL statements are performant for their
+ * specific dialect and override them if necessary.
+ */
+@PublicEvolving
+public abstract class AbstractDialect implements JdbcDialect {
 
     @Override
-    public void validate(TableSchema schema) throws ValidationException {
-        for (int i = 0; i < schema.getFieldCount(); i++) {
-            DataType dt = schema.getFieldDataType(i).get();
-            String fieldName = schema.getFieldName(i).get();
+    public void validate(DataType dataType) throws ValidationException {
+        if (!(dataType.getLogicalType() instanceof RowType)) {
+            throw new ValidationException("Logical DataType must be a RowType");
+        }

Review comment:
       Since this always _has_ to be the row type, does it make sense to pass `RowType` directly in the interface and move this check to where we call the method?
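
       Sketch of the shape I mean, with hypothetical stand-in classes (`LogicalTypeSketch`, `RowTypeSketch`, `DialectSketch` are placeholders, not the real Flink types): the dialect's `validate` takes the row type directly, and the `instanceof` check happens once at the call site.

       ```java
       // Minimal stand-ins to show the signature change, not the real classes.
       class LogicalTypeSketch {}

       class RowTypeSketch extends LogicalTypeSketch {}

       class DialectSketch {
           // The dialect can now rely on the type system; no instanceof check here.
           public void validate(RowTypeSketch rowType) {
               // per-field validation would go here
           }
       }

       class CallerSketch {
           // The caller performs the check once before handing off to the dialect.
           static void validateWith(LogicalTypeSketch logicalType, DialectSketch dialect) {
               if (!(logicalType instanceof RowTypeSketch)) {
                   throw new IllegalArgumentException("Logical DataType must be a RowType");
               }
               dialect.validate((RowTypeSketch) logicalType);
           }
       }
       ```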




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

