yihua commented on code in PR #11692:
URL: https://github.com/apache/hudi/pull/11692#discussion_r1717598455


##########
hudi-spark-datasource/hudi-spark/src/test/scala/org/apache/hudi/TestHoodieSparkSqlWriter.scala:
##########
@@ -1274,7 +1274,7 @@ object TestHoodieSparkSqlWriter {
 
     // NOTE: Hudi doesn't support Orc in Spark < 3.0
     //       Please check HUDI-4496 for more details
-    val targetScenarios = if (HoodieSparkUtils.gteqSpark3_0) {
+    val targetScenarios = if (HoodieSparkUtils.gteqSpark3_3) {

Review Comment:
   Let's add `TODO(HUDI-WXYZ)` so these can be easily tracked and fixed in a follow-up.
   
In this particular case, once Spark 2 support is removed, there will be no need to check the Spark version at all.
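
A minimal sketch of what the suggested annotation might look like. This is illustrative only: the ticket ID `HUDI-WXYZ` is the placeholder from the comment above, `gteqSpark3_3` is stubbed here rather than taken from `HoodieSparkUtils`, and the scenario names are made up for the example.

```scala
object TargetScenariosSketch {
  // Stub standing in for HoodieSparkUtils.gteqSpark3_3 (assumed true on Spark >= 3.3).
  val gteqSpark3_3: Boolean = true

  // TODO(HUDI-WXYZ): once Spark 2 support is removed, drop this version
  // guard and enable all scenarios unconditionally.
  val targetScenarios: Seq[String] =
    if (gteqSpark3_3) Seq("cow", "mor") else Seq("cow")

  def main(args: Array[String]): Unit =
    println(targetScenarios.mkString(", "))
}
```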



##########
hudi-spark-datasource/hudi-spark3.3.x/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/Spark33LegacyHoodieParquetFileFormat.scala:
##########
@@ -271,7 +255,7 @@ class Spark33LegacyHoodieParquetFileFormat(private val shouldAppendPartitionValu
              DataSourceUtils.int96RebaseSpec(footerFileMetaData.getKeyValueMetaData.get, int96RebaseModeInRead)
            val datetimeRebaseSpec =
              DataSourceUtils.datetimeRebaseSpec(footerFileMetaData.getKeyValueMetaData.get, datetimeRebaseModeInRead)
-            new Spark32PlusHoodieVectorizedParquetRecordReader(
+            new Spark3HoodieVectorizedParquetRecordReader(

Review Comment:
   Could you file a ticket to review Spark's parquet file format class in recent Spark versions and see whether there are any changes that need to be ported to our file format class?



##########
hudi-spark-datasource/hudi-spark3.3.x/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/Spark33LegacyHoodieParquetFileFormat.scala:
##########
@@ -174,35 +173,20 @@ class Spark33LegacyHoodieParquetFileFormat(private val shouldAppendPartitionValu
       // Try to push down filters when filter push-down is enabled.
       val pushed = if (enableParquetFilterPushDown) {
         val parquetSchema = footerFileMetaData.getSchema
-        val parquetFilters = if (HoodieSparkUtils.gteqSpark3_2_1) {
-          // NOTE: Below code could only be compiled against >= Spark 3.2.1,
-          //       and unfortunately won't compile against Spark 3.2.0
-          //       However this code is runtime-compatible w/ both Spark 3.2.0 and >= Spark 3.2.1
-          val datetimeRebaseSpec =
+        // NOTE: Below code could only be compiled against >= Spark 3.2.1,
+        //       and unfortunately won't compile against Spark 3.2.0
+        //       However this code is runtime-compatible w/ both Spark 3.2.0 and >= Spark 3.2.1

Review Comment:
   This note can be removed.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
