sjwiesman commented on a change in pull request #17264:
URL: https://github.com/apache/flink/pull/17264#discussion_r712274377



##########
File path: 
flink-connectors/flink-connector-jdbc/src/main/java/org/apache/flink/connector/jdbc/dialect/AbstractDialect.java
##########
@@ -18,80 +18,252 @@
 
 package org.apache.flink.connector.jdbc.dialect;
 
-import org.apache.flink.table.api.TableSchema;
+import org.apache.flink.annotation.PublicEvolving;
 import org.apache.flink.table.api.ValidationException;
 import org.apache.flink.table.types.DataType;
 import org.apache.flink.table.types.logical.DecimalType;
 import org.apache.flink.table.types.logical.LogicalTypeRoot;
+import org.apache.flink.table.types.logical.RowType;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.VarBinaryType;
+import org.apache.flink.util.Preconditions;
 
-import java.util.List;
+import java.util.Arrays;
+import java.util.Optional;
+import java.util.Set;
+import java.util.stream.Collectors;
 
-abstract class AbstractDialect implements JdbcDialect {
+import static java.lang.String.format;
+
+/**
+ * Base class for {@link JdbcDialect JdbcDialects} that implements basic data type validation and
+ * the construction of basic {@code INSERT}, {@code UPDATE}, {@code DELETE}, and {@code SELECT}
+ * statements.
+ *
+ * <p>Implementors should be careful to check that the default SQL statements are performant for
+ * their specific dialect and override them if necessary.
+ */
+@PublicEvolving
+public abstract class AbstractDialect implements JdbcDialect {

Review comment:
       I understand you are not saying to get rid of `validate`.
   
   I am trying to strike a trade-off between expressiveness and usability. I'd also like to set aside the fact that we have some dialects implemented in-tree and think about someone building a custom dialect.
   
   The average dialect implementor is not going to have a lot of Flink experience. This is probably their first Flink application, and they are someone with a product requirement to get something working with their database. What we want is to ensure they are successful so that they continue using Flink and take it to production.
   
   Having this default implementation significantly lowers the barrier to entry. The user lists the data types supported by their DB and gets something sane out of the box. We are on the same page that it does not work generically for all databases, but I think it works well enough that the improved user experience outweighs the costs.
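   To make the trade-off concrete, here is a minimal, self-contained sketch of the pattern being argued for. This is *not* Flink's actual `AbstractDialect` API (the class, enum, and method names below are stand-ins invented for illustration): the point is that the base class owns validation, so a first-time dialect author only has to list what their database supports.

```java
import java.util.EnumSet;
import java.util.Set;

// Simplified stand-in for the design under discussion (NOT Flink's real
// AbstractDialect): the base class validates a schema against the set of
// type roots the subclass declares, so a dialect implementor only lists
// the types their database supports.
public class DialectSketch {

    enum TypeRoot { INTEGER, VARCHAR, TIMESTAMP, DECIMAL }

    abstract static class AbstractDialect {
        // Subclasses declare the data types their database supports.
        abstract Set<TypeRoot> supportedTypes();

        // Default validation every dialect gets "for free".
        void validate(Iterable<TypeRoot> schemaTypes) {
            for (TypeRoot t : schemaTypes) {
                if (!supportedTypes().contains(t)) {
                    throw new IllegalArgumentException("Unsupported type: " + t);
                }
            }
        }
    }

    // The hypothetical first-time implementor's entire dialect.
    static class MyDbDialect extends AbstractDialect {
        @Override
        Set<TypeRoot> supportedTypes() {
            return EnumSet.of(TypeRoot.INTEGER, TypeRoot.VARCHAR);
        }
    }

    public static void main(String[] args) {
        AbstractDialect dialect = new MyDbDialect();
        dialect.validate(EnumSet.of(TypeRoot.INTEGER)); // passes silently
        boolean rejected = false;
        try {
            dialect.validate(EnumSet.of(TypeRoot.TIMESTAMP));
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        System.out.println("rejected=" + rejected);
    }
}
```

   The subclass is a handful of lines; anything database-specific (statement syntax, precision ranges) can still be overridden, which is the expressiveness half of the trade-off.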




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
