yihua commented on code in PR #12964:
URL: https://github.com/apache/hudi/pull/12964#discussion_r2110786854
##########
hudi-flink-datasource/hudi-flink/pom.xml:
##########
@@ -348,6 +348,11 @@
</exclusion>
</exclusions>
</dependency>
+ <dependency>
+ <groupId>javax.validation</groupId>
+ <artifactId>validation-api</artifactId>
+ <version>2.0.1.Final</version>
+ </dependency>
Review Comment:
Same here: can we see if we can avoid this dependency?
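
If the dependency cannot be dropped outright, one conventional mitigation (a sketch of a general Maven option, not necessarily what this PR should do) is to narrow its scope so the jar is not bundled, e.g. when the annotations are only needed at compile time:

```xml
<!-- Hypothetical alternative: "provided" scope keeps the classes on the
     compile classpath but excludes the jar from the packaged artifact. -->
<dependency>
  <groupId>javax.validation</groupId>
  <artifactId>validation-api</artifactId>
  <version>2.0.1.Final</version>
  <scope>provided</scope>
</dependency>
```

Whether that is safe depends on whether the validation classes are needed at runtime in the Flink bundle.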
##########
hudi-common/pom.xml:
##########
@@ -114,6 +114,11 @@
<artifactId>jol-core</artifactId>
</dependency>
+ <dependency>
+ <groupId>javax.xml.bind</groupId>
+ <artifactId>jaxb-api</artifactId>
+ </dependency>
Review Comment:
Similarly, revisit this dependency.
##########
hudi-client/hudi-spark-client/src/main/scala/org/apache/spark/sql/HoodieInternalRowUtils.scala:
##########
@@ -34,9 +32,8 @@ import org.apache.spark.sql.types._
import org.apache.spark.unsafe.types.UTF8String
import java.util.concurrent.ConcurrentHashMap
-import java.util.function.{Function => JFunction}
+import java.util.function.{Supplier, Function => JFunction}
import java.util.{ArrayDeque => JArrayDeque, Collections => JCollections,
Deque => JDeque, Map => JMap}
-
Review Comment:
Keep the import grouping.
##########
hudi-client/hudi-spark-client/src/main/scala/org/apache/spark/sql/HoodieInternalRowUtils.scala:
##########
@@ -21,9 +21,7 @@ package org.apache.spark.sql
import org.apache.hudi.AvroConversionUtils.convertAvroSchemaToStructType
import org.apache.hudi.avro.HoodieAvroUtils.{createFullName, toJavaDate}
import org.apache.hudi.exception.HoodieException
-
Review Comment:
Keep the import grouping.
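
For reference, the imports touched by this diff, grouped in the usual way (third-party packages and `java.*` packages separated by blank lines), would look roughly like:

```scala
import org.apache.hudi.AvroConversionUtils.convertAvroSchemaToStructType
import org.apache.hudi.avro.HoodieAvroUtils.{createFullName, toJavaDate}
import org.apache.hudi.exception.HoodieException

import org.apache.spark.sql.types._
import org.apache.spark.unsafe.types.UTF8String

import java.util.concurrent.ConcurrentHashMap
import java.util.function.{Supplier, Function => JFunction}
import java.util.{ArrayDeque => JArrayDeque, Collections => JCollections, Deque => JDeque, Map => JMap}
```

This is only a sketch of the grouping convention; the exact ordering should follow the project's existing style in this file.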
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]