dawidwys commented on a change in pull request #10606: 
[FLINK-15009][table-common] Add a utility for creating type inference logic via 
reflection
URL: https://github.com/apache/flink/pull/10606#discussion_r363329380
 
 

 ##########
 File path: flink-table/flink-table-api-scala/src/test/scala/org/apache/flink/table/types/extraction/DataTypeExtractorScalaTest.scala
 ##########
 @@ -50,12 +52,36 @@ class DataTypeExtractorScalaTest {
     // many annotations that partially override each other
     TestSpec
         .forType(classOf[ScalaSimplePojoWithManyAnnotations])
 -        .expectDataType(getSimplePojoDataType(classOf[ScalaSimplePojoWithManyAnnotations]))
 +        .expectDataType(getSimplePojoDataType(classOf[ScalaSimplePojoWithManyAnnotations])),
+
+    // invalid Scala tuple
+    TestSpec
+        .forType(classOf[ScalaPojoWithInvalidTuple])
 +        .expectErrorMessage("Scala tuples are not supported. Use case classes or 'org.apache.flink.types.Row' instead."),
+
+    // invalid Scala map
+    TestSpec
+        .forType(classOf[ScalaPojoWithInvalidMap])
+        .expectErrorMessage("Scala collections are not supported. " +
 +          "See the documentation for supported classes or treat them as RAW types.")
   )
 
   @Test
   def testScalaExtraction(): Unit = {
-    parameters.foreach(runExtraction)
+    parameters.foreach { testSpec =>
 
 Review comment:
   Why can't we have a parameterized test here? We could use the 
`ExpectedException` rule then.
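
   For illustration, a rough sketch of what such a parameterized test could look like, using the JUnit 4 `Parameterized` runner together with the `ExpectedException` rule. `TestSpec`, its `expectedErrorMessage` accessor, and `runExtraction` are assumed to be the helpers already present in this test class; the exact names and shapes here are hypothetical:

   ```scala
   import org.junit.{Rule, Test}
   import org.junit.rules.ExpectedException
   import org.junit.runner.RunWith
   import org.junit.runners.Parameterized
   import org.junit.runners.Parameterized.Parameters
   import org.apache.flink.table.api.ValidationException

   // Sketch only: each TestSpec becomes its own parameterized test case,
   // so a failure in one spec no longer hides the remaining specs.
   @RunWith(classOf[Parameterized])
   class DataTypeExtractorScalaTest(testSpec: TestSpec) {

     private val expectedException = ExpectedException.none()

     // JUnit 4.11+ accepts @Rule on a public method returning the rule,
     // which sidesteps Scala's field-annotation visibility pitfalls.
     @Rule
     def thrown: ExpectedException = expectedException

     @Test
     def testScalaExtraction(): Unit = {
       // If the spec declares an expected error, arm the rule before running.
       testSpec.expectedErrorMessage.foreach { message =>
         thrown.expect(classOf[ValidationException])
         thrown.expectMessage(message)
       }
       runExtraction(testSpec)
     }
   }

   object DataTypeExtractorScalaTest {
     // The existing `parameters` sequence from the test, exposed to the runner.
     @Parameters
     def testData: java.util.Collection[TestSpec] = {
       java.util.Arrays.asList(/* the TestSpecs defined above */)
     }
   }
   ```

   With this shape the per-spec `foreach` loop in `testScalaExtraction` disappears, since the runner drives one invocation per spec.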

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services
