okumin commented on code in PR #4090:
URL: https://github.com/apache/hive/pull/4090#discussion_r1199604574


##########
ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFArrayExcept.java:
##########
@@ -0,0 +1,59 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.hadoop.hive.ql.udf.generic;
+
+import org.apache.hadoop.hive.ql.exec.Description;
+import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
+import org.apache.hadoop.hive.ql.metadata.HiveException;
+import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
+import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
+
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * GenericUDFArrayExcept
+ */
+@Description(name = "array_except", value = "_FUNC_(array1, array2) - Returns an array of the elements in array1 but not in array2.", extended =
+    "Example:\n" + "  > SELECT _FUNC_(array(1, 2, 3, 4), array(2, 3)) FROM src LIMIT 1;\n"
+        + "  [1,4]")
+@NDV(maxNdv = 2)
+public class GenericUDFArrayExcept extends AbstractGenericUDFArrayBase {
+  static final int ARRAY2_IDX = 1;
+  private static final String FUNC_NAME = "ARRAY_EXCEPT";
+
+  public GenericUDFArrayExcept() {
+    super(FUNC_NAME, 2, 2, ObjectInspector.Category.LIST);
+  }
+
+  @Override
+  public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
+    ObjectInspector defaultOI = super.initialize(arguments);
+    checkArgCategory(arguments, ARRAY2_IDX, ObjectInspector.Category.LIST, FUNC_NAME,
+        org.apache.hadoop.hive.serde.serdeConstants.LIST_TYPE_NAME); // Array1 is already validated in the parent class

Review Comment:
   Thanks. I think we should first think through the expected specification. What should the following SQL return?
   
   ```
   SELECT array_except(array(1, 2, 3), array(2.0, 3.3));
   ```
   
   If it should return `array(1, 3)`, meaning type conversion is applied, we should add a test case where elements are removed after type conversion.
   If it should return `array(1, 2, 3)`, meaning the second argument has no effect when the element types don't match, I personally think we should raise a syntax error, since that situation arises only when a user has misunderstood the types of the 1st and 2nd arguments.
   
   For example, Spark 3.4 fails in that case, while PrestoSQL returns `[1.0, 3.0]` for the same SQL, meaning PrestoSQL applies type conversion from int to float. I personally think either is fine, but whichever we choose should be tested.
   
   ```
   spark-sql (default)> select array_except(array(1, 2, 3), array(2.0, 3.3));
   [DATATYPE_MISMATCH.BINARY_ARRAY_DIFF_TYPES] Cannot resolve "array_except(array(1, 2, 3), array(2.0, 3.3))" due to data type mismatch: Input to function `array_except` should have been two "ARRAY" with same element type, but it's ["ARRAY<INT>", "ARRAY<DECIMAL(2,1)>"].; line 1 pos 7;
   'Project [unresolvedalias(array_except(array(1, 2, 3), array(2.0, 3.3)), None)]
   +- OneRowRelation
   ```
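
   To make the stricter option concrete, here is a minimal standalone sketch (it deliberately avoids Hive's `ObjectInspector` machinery; the class and method names are hypothetical, not Hive's actual API) of rejecting mismatched element types up front, the way Spark 3.4 does:

   ```java
   // Hypothetical sketch: Spark-style rejection of mismatched array element
   // types at query-resolution time. Illustrative only; all names are invented.
   public class ElementTypeCheck {

     /** Throws if the two element type names differ, mirroring Spark's behavior. */
     static void checkSameElementType(String elemType1, String elemType2) {
       if (!elemType1.equals(elemType2)) {
         throw new IllegalArgumentException(
             "array_except expects two arrays with the same element type, but got "
                 + "ARRAY<" + elemType1 + "> and ARRAY<" + elemType2 + ">");
       }
     }

     public static void main(String[] args) {
       checkSameElementType("int", "int"); // same element types: accepted
       try {
         checkSameElementType("int", "decimal(2,1)"); // mismatch: rejected
       } catch (IllegalArgumentException e) {
         System.out.println(e.getMessage());
       }
     }
   }
   ```

   In Hive terms, the equivalent check would live in `initialize()`, where both arguments' type information is available, so a mismatch fails at compile time rather than silently returning the first array unchanged.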



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

