rahil-c commented on code in PR #18108:
URL: https://github.com/apache/hudi/pull/18108#discussion_r2805388502
##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/spark/sql/avro/TestSchemaConverters.scala:
##########
@@ -18,12 +18,15 @@
package org.apache.spark.sql.avro
import org.apache.hudi.avro.model.HoodieMetadataColumnStats
-import org.apache.hudi.common.schema.HoodieSchema
+import org.apache.hudi.common.schema.{HoodieSchema, HoodieSchemaField, HoodieSchemaType}
import org.apache.avro.JsonProperties
-import org.junit.jupiter.api.Assertions.assertEquals
+import org.apache.spark.sql.types.{DataTypes, MetadataBuilder, StructField, StructType}
+import org.junit.jupiter.api.Assertions.{assertEquals, assertFalse, assertThrows, assertTrue}
import org.junit.jupiter.api.Test
+import java.util
+
class TestSchemaConverters {
Review Comment:
This is more of a general comment about where these tests should be placed.
Currently, all of our testing of type conversions between Spark StructType
and HoodieSchema is done in the following class:
https://github.com/apache/hudi/blob/master/hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/hudi/TestHoodieSchemaConversionUtils.scala#L34
I have also put my vector schema related tests inside that class:
https://github.com/apache/hudi/pull/18190/changes#diff-70e28ca555d1f86b32e1c2965c9c5435da6e59a0ac89f2d1ff24a54fe5af1f6aR573
I'm wondering whether we should continue to put tests in
TestHoodieSchemaConversionUtils, or whether these new type tests should go in
TestSchemaConverters?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]