yihua commented on code in PR #12772:
URL: https://github.com/apache/hudi/pull/12772#discussion_r2364785931


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/SparkBaseRowSerDe.scala:
##########
@@ -23,7 +23,7 @@ import org.apache.spark.sql.Row
 import org.apache.spark.sql.catalyst.InternalRow
 import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder
 
-class Spark3RowSerDe(val encoder: ExpressionEncoder[Row]) extends SparkRowSerDe {
+class SparkBaseRowSerDe(val encoder: ExpressionEncoder[Row]) extends SparkRowSerDe {

Review Comment:
   nit: we may merge `SparkBaseRowSerDe` and `SparkRowSerDe` into one class.
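   A rough sketch of what the merged class might look like. This is only an illustration of the suggestion, not the actual Hudi code: the method names `serializeRow`/`deserializeRow` are assumptions about the current `SparkRowSerDe` contract, and `createSerializer`/`createDeserializer` are the Spark 3.x `ExpressionEncoder` APIs:

   ```scala
   import org.apache.spark.sql.Row
   import org.apache.spark.sql.catalyst.InternalRow
   import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

   // Single concrete class replacing the SparkRowSerDe trait plus the
   // SparkBaseRowSerDe implementation, since the conversion logic no
   // longer varies by Spark version.
   class SparkRowSerDe(val encoder: ExpressionEncoder[Row]) {
     private lazy val serializer = encoder.createSerializer()
     private lazy val deserializer = encoder.createDeserializer()

     def serializeRow(row: Row): InternalRow = serializer(row)

     def deserializeRow(internalRow: InternalRow): Row = deserializer(internalRow)
   }
   ```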



##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/SparkHoodieTableFileIndex.scala:
##########
@@ -417,7 +417,7 @@ class SparkHoodieTableFileIndex(spark: SparkSession,
       schema,
       metaClient.getTableConfig.propsMap,
      configProperties.getString(DateTimeUtils.TIMEZONE_OPTION, SQLConf.get.sessionLocalTimeZone),
-      sparkParsePartitionUtil,
+      SparkBaseParsePartitionUtil,

Review Comment:
   If `SparkBaseParsePartitionUtil` is the same across all Spark versions, there is no need to pass this object around.



##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieBaseRelation.scala:
##########
@@ -502,7 +503,7 @@ abstract class HoodieBaseRelation(val sqlContext: SQLContext,
         tableStructSchema,
         tableConfig.propsMap,
         timeZoneId,
-        sparkAdapter.getSparkParsePartitionUtil,
+        SparkBaseParsePartitionUtil,

Review Comment:
   So we can get rid of `sparkAdapter.getSparkParsePartitionUtil`, which is the same across all Spark versions?
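   A minimal, self-contained sketch of the refactor being suggested, with hypothetical names and a toy `parse` implementation (the real `SparkParsePartitionUtil` signature is not shown here): instead of threading a version-specific util through the adapter and constructor parameters, call a shared singleton directly.

   ```scala
   // Hypothetical interface standing in for the version-specific abstraction.
   trait ParsePartitionUtil {
     def parse(path: String): Map[String, String]
   }

   // If the implementation is identical across Spark versions, a single
   // shared object suffices and no adapter lookup is needed.
   object SparkBaseParsePartitionUtil extends ParsePartitionUtil {
     def parse(path: String): Map[String, String] =
       path.split('/').flatMap { segment =>
         segment.split('=') match {
           case Array(key, value) => Some(key -> value)
           case _                 => None
         }
       }.toMap
   }

   // Call sites reference the object directly, with no parameter passing:
   // val values = SparkBaseParsePartitionUtil.parse("year=2024/month=01")
   ```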



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
