MasseGuillaume commented on code in PR #3207:
URL: https://github.com/apache/calcite/pull/3207#discussion_r1220712509


##########
testkit/src/main/java/org/apache/calcite/test/SqlOperatorTest.java:
##########
@@ -5364,6 +5364,31 @@ private static void checkIf(SqlOperatorFixture f) {
     f.checkNull("array_concat(cast(null as integer array), array[1])");
   }
 
+  /** Tests {@code ARRAY_CONTAINS} function from Spark. */
+  @Test void testArrayContainsFunc() {
+    final SqlOperatorFixture f0 = fixture();
+    f0.setFor(SqlLibraryOperators.ARRAY_CONTAINS);
+    f0.checkFails("^array_contains(array[1, 2], 1)^",
+        "No match found for function signature "
+            + "ARRAY_CONTAINS\\(<INTEGER ARRAY>, <NUMERIC>\\)", false);
+
+    final SqlOperatorFixture f = f0.withLibrary(SqlLibrary.SPARK);
+    f.checkScalar("array_contains(array[1, 2], 1)", true,
+        "BOOLEAN NOT NULL");
+    f.checkScalar("array_contains(array[1, null], cast(null as integer))", true,
+        "BOOLEAN NOT NULL");

Review Comment:
   Do we want to type-check exactly as Apache Spark does?
   
   ```
   spark.sql("select array_contains(array(1, null), null)").show()
   org.apache.spark.sql.AnalysisException: cannot resolve 'array_contains(array(1, CAST(NULL AS INT)), NULL)' due to data type mismatch: Null typed values cannot be used as arguments; line 1 pos 7;
   ```
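
   For context, Spark's documented `array_contains` is three-valued: a null search value yields NULL (an *untyped* NULL is rejected at analysis time, as shown above), and an unmatched search over an array that contains null also yields NULL rather than FALSE. A minimal Java sketch of those semantics (method name and structure are illustrative, not Calcite or Spark source):

   ```java
   import java.util.Arrays;
   import java.util.List;
   import java.util.Objects;

   public class ArrayContainsSketch {
     /** Illustrative sketch of Spark's three-valued ARRAY_CONTAINS:
      * a null search value yields NULL, and an unmatched search over an
      * array containing null yields NULL rather than FALSE. */
     static Boolean arrayContains(List<?> array, Object value) {
       if (value == null) {
         // Typed NULL value: result is NULL. (Spark rejects an untyped
         // NULL earlier, at analysis time.)
         return null;
       }
       boolean sawNull = false;
       for (Object e : array) {
         if (e == null) {
           sawNull = true;
         } else if (Objects.equals(e, value)) {
           return true;
         }
       }
       return sawNull ? null : false;
     }

     public static void main(String[] args) {
       System.out.println(arrayContains(Arrays.asList(1, 2), 1));    // true
       System.out.println(arrayContains(Arrays.asList(1, null), 2)); // null
     }
   }
   ```

   Under these semantics, `array_contains(array[1, null], cast(null as integer))` would evaluate to NULL, not TRUE, which is what the question above is getting at.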



##########
testkit/src/main/java/org/apache/calcite/test/SqlOperatorTest.java:
##########
@@ -5364,6 +5364,31 @@ private static void checkIf(SqlOperatorFixture f) {
     f.checkNull("array_concat(cast(null as integer array), array[1])");
   }
 
+  /** Tests {@code ARRAY_CONTAINS} function from Spark. */
+  @Test void testArrayContainsFunc() {
+    final SqlOperatorFixture f0 = fixture();
+    f0.setFor(SqlLibraryOperators.ARRAY_CONTAINS);
+    f0.checkFails("^array_contains(array[1, 2], 1)^",
+        "No match found for function signature "
+            + "ARRAY_CONTAINS\\(<INTEGER ARRAY>, <NUMERIC>\\)", false);
+
+    final SqlOperatorFixture f = f0.withLibrary(SqlLibrary.SPARK);
+    f.checkScalar("array_contains(array[1, 2], 1)", true,
+        "BOOLEAN NOT NULL");
+    f.checkScalar("array_contains(array[1, null], cast(null as integer))", true,
+        "BOOLEAN NOT NULL");
+    f.checkScalar("array_contains(array[1], 1)", true,
+        "BOOLEAN NOT NULL");
+    f.checkScalar("array_contains(array(), 1)", false,
+        "BOOLEAN NOT NULL");
+    f.checkScalar("array_contains(array[map[1, 'a'], map[2, 'b']], map[1, 'a'])", true,
+        "BOOLEAN NOT NULL");

Review Comment:
   ```
   spark.sql("""select array_contains(array(map(1, "1"), map(2, "2")), map(2, "2"))""").show()
   org.apache.spark.sql.AnalysisException: cannot resolve 'array_contains(array(map(1, '1'), map(2, '2')), map(2, '2'))' due to data type mismatch: function array_contains does not support ordering on type map<int,string>; line 1 pos 7;
   ```
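
   The Spark error is about ordering, not equality: Spark's `array_contains` requires an ordering on the element type, and maps have none, so it rejects map elements at analysis time. Equality-based containment, as the Calcite test above exercises, is still well-defined for maps. A small illustrative Java snippet (not Calcite or Spark source) showing that map equality works without any ordering:

   ```java
   import java.util.Map;

   public class MapEqualitySketch {
     public static void main(String[] args) {
       // Maps support structural equality but no total ordering. Spark's
       // ARRAY_CONTAINS demands an ordering on the element type and so
       // rejects map elements, whereas an equality-based containment check
       // (as in the Calcite test) has no such requirement.
       Map<Integer, String> a = Map.of(1, "a");
       Map<Integer, String> b = Map.of(1, "a");
       System.out.println(a.equals(b)); // true
     }
   }
   ```

   So the question here is whether Calcite should deliberately be laxer than Spark, or mirror Spark's rejection of unorderable element types.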



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
