fsk119 commented on code in PR #24183:
URL: https://github.com/apache/flink/pull/24183#discussion_r1468800471


##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/FunctionSignatureTemplate.java:
##########
@@ -44,19 +45,24 @@ final class FunctionSignatureTemplate {
 
     final @Nullable String[] argumentNames;
 
+    final @Nullable Boolean[] argumentOptionals;

Review Comment:
   Please also update the `hashCode` and `equals` methods when adding a new member field.
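   For illustration only, a minimal sketch of what that could look like, assuming the class keeps the array-typed fields shown in the quoted diff and that `java.util.Arrays`/`java.util.Objects` are imported; the existing implementation may differ:
   ```
   @Override
   public boolean equals(Object o) {
       if (this == o) {
           return true;
       }
       if (o == null || getClass() != o.getClass()) {
           return false;
       }
       FunctionSignatureTemplate that = (FunctionSignatureTemplate) o;
       return isVarArgs == that.isVarArgs
               && argumentTemplates.equals(that.argumentTemplates)
               && Arrays.equals(argumentNames, that.argumentNames)
               // include the newly added field
               && Arrays.equals(argumentOptionals, that.argumentOptionals);
   }

   @Override
   public int hashCode() {
       int result = Objects.hash(argumentTemplates, isVarArgs);
       result = 31 * result + Arrays.hashCode(argumentNames);
       // include the newly added field
       result = 31 * result + Arrays.hashCode(argumentOptionals);
       return result;
   }
   ```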



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/BaseMappingExtractor.java:
##########
@@ -323,6 +327,37 @@ private void verifyMappingForMethod(
                        verification.verify(method, signature.toClass(), result.toClass()));
     }
 
+    private void verifyOptionalOnPrimitiveParameter(
+            Method method,
+            Map<FunctionSignatureTemplate, FunctionResultTemplate> collectedMappingsPerMethod) {
+
+        collectedMappingsPerMethod.forEach(
+                (signature, result) -> {
+                    Boolean[] argumentOptional = signature.argumentOptionals;
+                    if (argumentOptional != null
+                            && Arrays.stream(argumentOptional).anyMatch(Boolean::booleanValue)) {
+                        // do something check
+                        FunctionSignatureTemplate functionResultTemplate =
+                                signatureExtraction.extract(this, method);
+                        for (int i = 0; i < argumentOptional.length; i++) {
+                            Class<?> converionClass =
+                                    functionResultTemplate
+                                            .argumentTemplates
+                                            .get(i)
+                                            .dataType
+                                            .getConversionClass();
+                            if (argumentOptional[i]
+                                    && converionClass != null
+                                    && converionClass.isPrimitive()) {
+                                throw extractionError(
+                                        "Argument at position %d is optional but a primitive type parameter.",

Review Comment:
   Suggested message: "Argument at position %d is optional but a primitive type doesn't accept a null value."
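   Purely for illustration, the suggested wording in the quoted `extractionError` call, assuming the loop index `i` from the surrounding code is passed as the format argument:
   ```
   throw extractionError(
           "Argument at position %d is optional but a primitive type doesn't accept a null value.",
           i);
   ```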



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/BaseMappingExtractor.java:
##########
@@ -323,6 +327,37 @@ private void verifyMappingForMethod(
                        verification.verify(method, signature.toClass(), result.toClass()));
     }
 
+    private void verifyOptionalOnPrimitiveParameter(
+            Method method,
+            Map<FunctionSignatureTemplate, FunctionResultTemplate> collectedMappingsPerMethod) {
+

Review Comment:
   nit: remove this empty line.
   
   It's better to keep the code style consistent.



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/BaseMappingExtractor.java:
##########
@@ -323,6 +327,37 @@ private void verifyMappingForMethod(
                        verification.verify(method, signature.toClass(), result.toClass()));
     }
 
+    private void verifyOptionalOnPrimitiveParameter(
+            Method method,
+            Map<FunctionSignatureTemplate, FunctionResultTemplate> collectedMappingsPerMethod) {
+
+        collectedMappingsPerMethod.forEach(
+                (signature, result) -> {
+                    Boolean[] argumentOptional = signature.argumentOptionals;
+                    if (argumentOptional != null
+                            && Arrays.stream(argumentOptional).anyMatch(Boolean::booleanValue)) {
+                        // do something check
+                        FunctionSignatureTemplate functionResultTemplate =

Review Comment:
   Why not use `signature.argumentTemplates`, which already has the data type for every argument? If you don't have any concerns, I suggest moving this part into `FunctionTemplate` instead.
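   A rough sketch of that suggestion, reusing only names that appear in the quoted diff (`signature.argumentTemplates`, `dataType`, `getConversionClass()`); it is illustrative, not the final implementation:
   ```
   for (int i = 0; i < argumentOptional.length; i++) {
       // read the data type directly from the signature instead of re-extracting it
       Class<?> conversionClass =
               signature.argumentTemplates.get(i).dataType.getConversionClass();
       if (argumentOptional[i] && conversionClass != null && conversionClass.isPrimitive()) {
           throw extractionError(
                   "Argument at position %d is optional but a primitive type doesn't accept a null value.",
                   i);
       }
   }
   ```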



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/FunctionTemplate.java:
##########
@@ -194,14 +194,17 @@ private static <T, H extends Annotation> T defaultAsNull(
                     "Argument and input hints cannot be declared in the same function hint.");
         }
 
+        Boolean[] argumentOptionals = null;
         if (argumentHints != null) {
             argumentHintNames = new String[argumentHints.length];
             argumentHintTypes = new DataTypeHint[argumentHints.length];
+            argumentOptionals = new Boolean[argumentHints.length];

Review Comment:
   Is a var-length (varargs) argument optional by default?



##########
flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/functions/sql/FlinkSqlOperatorTable.java:
##########
@@ -1286,4 +1287,6 @@ public List<SqlGroupedWindowFunction> getAuxiliaryFunctions() {
                     .operandTypeChecker(OperandTypes.NILADIC)
                     .notDeterministic()
                     .build();
+
+    public static final SqlSpecialOperator DEFAULT = SqlStdOperatorTable.DEFAULT;

Review Comment:
   Please add some description for this. Can users use it directly?
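   Purely as an illustration of what such a description could look like, assuming the operator mirrors Calcite's use of `DEFAULT` as a placeholder for omitted optional arguments; whether users may call it directly is exactly what the description should clarify:
   ```
   /**
    * Placeholder operator for omitted (optional/named) arguments in a call,
    * delegating to Calcite's {@link SqlStdOperatorTable#DEFAULT}.
    */
   public static final SqlSpecialOperator DEFAULT = SqlStdOperatorTable.DEFAULT;
   ```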



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/FunctionSignatureTemplate.java:
##########
@@ -44,19 +45,24 @@ final class FunctionSignatureTemplate {
 
     final @Nullable String[] argumentNames;
 
+    final @Nullable Boolean[] argumentOptionals;
+
     private FunctionSignatureTemplate(
             List<FunctionArgumentTemplate> argumentTemplates,
             boolean isVarArgs,
-            @Nullable String[] argumentNames) {
+            @Nullable String[] argumentNames,
+            @Nullable Boolean[] argumentOptionals) {

Review Comment:
   Do we need nullable here? If users don't specify optional explicitly, we can assume the argument is required.
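   A minimal sketch of the non-nullable alternative, assuming a missing hint simply means "required" and `java.util.Arrays` is imported (illustrative only):
   ```
   private FunctionSignatureTemplate(
           List<FunctionArgumentTemplate> argumentTemplates,
           boolean isVarArgs,
           @Nullable String[] argumentNames,
           @Nullable Boolean[] argumentOptionals) {
       // ... existing assignments ...

       // treat a missing hint as "required" instead of keeping a nullable field
       if (argumentOptionals == null) {
           argumentOptionals = new Boolean[argumentTemplates.size()];
           Arrays.fill(argumentOptionals, Boolean.FALSE);
       }
       this.argumentOptionals = argumentOptionals;
   }
   ```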



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/BaseMappingExtractor.java:
##########
@@ -143,6 +144,9 @@ protected Map<FunctionSignatureTemplate, FunctionResultTemplate> extractResultMa
                 // check if the method can be called
                verifyMappingForMethod(correctMethod, collectedMappingsPerMethod, verification);
 
+                // check if we declare optional on a primitive type parameter
+                verifyOptionalOnPrimitiveParameter(correctMethod, collectedMappingsPerMethod);
+
                 // check if method strategies conflict with function strategies
                 collectedMappingsPerMethod.forEach(
                        (signature, result) -> putMapping(collectedMappings, signature, result));

Review Comment:
   `putMapping` should also check this kind of clash when introducing the optional keyword, e.g. (pseudo-code, where a default value marks `format` as optional):
   ```
   class ToBytes extends ScalarFunction {
   
       byte[] eval(String input, String format = "gzip") {}
   
       byte[] eval(String input) {}
   }
   ```
   With `format` optional, a call with a single STRING argument matches both `eval` methods, so the mapping step should detect the conflict.
   



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/extraction/BaseMappingExtractor.java:
##########
@@ -323,6 +327,37 @@ private void verifyMappingForMethod(
                        verification.verify(method, signature.toClass(), result.toClass()));
     }
 
+    private void verifyOptionalOnPrimitiveParameter(
+            Method method,
+            Map<FunctionSignatureTemplate, FunctionResultTemplate> collectedMappingsPerMethod) {
+
+        collectedMappingsPerMethod.forEach(

Review Comment:
   If we don't need the result, we can use `collectedMappingsPerMethod.keySet().forEach(...)` instead.
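   For illustration, the loop would then look roughly like this (sketch only):
   ```
   collectedMappingsPerMethod
           .keySet()
           .forEach(
                   signature -> {
                       Boolean[] argumentOptional = signature.argumentOptionals;
                       // ... same per-argument primitive check as before ...
                   });
   ```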



##########
flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/planner/codegen/calls/StringCallGen.scala:
##########
@@ -237,6 +238,9 @@ object StringCallGen {
         val currentDatabase = ctx.addReusableQueryLevelCurrentDatabase()
         generateNonNullField(returnType, currentDatabase)
 
+      case DEFAULT =>

Review Comment:
   The current implementation seems to add a built-in function named "DEFAULT" here. Does this mean users can no longer create a function named "DEFAULT"?



##########
flink-table/flink-table-planner/src/main/scala/org/apache/flink/table/planner/plan/utils/FlinkRexUtil.scala:
##########
@@ -653,6 +655,36 @@ object FlinkRexUtil {
       rexBuilder,
       converter);
   }
+
+  /** Extracts the default types of operands from a given SqlCall that uses named arguments. */
+  def extractDefaultTypes(call: SqlCall, validator: SqlValidator): Array[RelDataType] = {
+    if (!hasAssignment(call)) {
+      return Array.empty
+    }
+
+    extractDefaultTypes(call.getOperator, validator.getTypeFactory)
+  }
+
+  /** Extracts the default types of operands for a given SqlOperator that uses named arguments. */
+  def extractDefaultTypes(
+      sqlOperator: SqlOperator,
+      typeFactory: RelDataTypeFactory): Array[RelDataType] = {
+    sqlOperator.getOperandTypeChecker
+      .asInstanceOf[SqlOperandMetadata]

Review Comment:
   I am a little worried about the design here. If `extractDefaultTypes` is a common utility for others, it should handle all kinds of `SqlOperator`, but the current implementation only handles operators that use named arguments. It is not good design to force callers to read the implementation, since others are not as familiar with it as we are.
   
   I suggest we add a class that extends `SqlCallBinding` and move all of this into that new class. WDYT?
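   A very rough sketch of that idea, just to show the shape; the class and method names are hypothetical, and Calcite's `SqlCallBinding`, `SqlOperandMetadata`, `SqlValidator`, etc. are assumed to be imported:
   ```
   /** Hypothetical binding that hides the "named arguments" special case from callers. */
   public class FlinkSqlCallBinding extends SqlCallBinding {

       public FlinkSqlCallBinding(SqlValidator validator, SqlValidatorScope scope, SqlCall call) {
           super(validator, scope, call);
       }

       /** Declared operand types of the operator, or empty if it exposes no operand metadata. */
       public List<RelDataType> getDeclaredOperandTypes() {
           SqlOperandTypeChecker checker = getOperator().getOperandTypeChecker();
           if (checker instanceof SqlOperandMetadata) {
               return ((SqlOperandMetadata) checker).paramTypes(getTypeFactory());
           }
           return Collections.emptyList();
       }
   }
   ```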
   



##########
flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/inference/TypeInference.java:
##########
@@ -46,6 +46,8 @@ public final class TypeInference {
 
     private final @Nullable List<String> namedArguments;
 
+    private final @Nullable List<Boolean> optionalArguments;

Review Comment:
   I am worried about the current design of function overloading and optional parameters.
   
   The current implementation suggests that a UDF can use either function overloading or optional parameters, but not both. In some situations we may need both abilities (pseudo-code, where a default value marks `format` as optional):
   
   ```
   class ToBytes extends ScalarFunction {
   
       byte[] eval(String input, String format = "gzip") {}
   
       byte[] eval(int input, String format = "gzip") {}
   }
   ```
   
   Actually, I think the argument optionals and names (aliases) should belong to the `InputTypeStrategy`. It would be better to open another FLIP to discuss the API design, cc @xuyangzhong @lincoln-lil



##########
flink-table/flink-table-common/src/test/java/org/apache/flink/table/types/extraction/TypeInferenceExtractorTest.java:
##########
@@ -609,7 +611,23 @@ private static Stream<TestSpec> functionSpecs() {
                // For function with overloaded function, argument name will be empty
                TestSpec.forScalarFunction(
                        "Scalar function with overloaded functions and arguments hint declared.",
-                        ArgumentsHintScalarFunctionWithOverloadedFunction.class));
+                        ArgumentsHintScalarFunctionWithOverloadedFunction.class),
+                TestSpec.forScalarFunction(
+                                "Scalar function with argument type not null but optional.",
+                                ArgumentHintNotNullTypeWithOptionalsScalarFunction.class)
+                        .expectErrorMessage(
+                                "Argument at position 0 is not null but a optional argument"),

Review Comment:
   Suggested message: "Argument at position 0 is optional but its type doesn't accept a null value."


