wombatu-kun commented on code in PR #12772:
URL: https://github.com/apache/hudi/pull/12772#discussion_r2097554346
##########
hudi-spark-datasource/hudi-spark/src/main/scala/org/apache/spark/sql/hudi/analysis/HoodieAnalysis.scala:
##########
@@ -66,24 +69,33 @@ object HoodieAnalysis extends SparkAdapterSupport {
val dataSourceV2ToV1Fallback: RuleBuilder =
session => instantiateKlass(dataSourceV2ToV1FallbackClass, session)
-  val spark3ResolveReferencesClass =
-    "org.apache.spark.sql.hudi.analysis.HoodieSpark3ResolveReferences"
-  val spark3ResolveReferences: RuleBuilder =
-    session => instantiateKlass(spark3ResolveReferencesClass, session)
+  val spark3PlusResolveReferencesClass = if (HoodieSparkUtils.isSpark4)
+    "org.apache.spark.sql.hudi.analysis.HoodieSpark4ResolveReferences"
+  else
+    "org.apache.spark.sql.hudi.analysis.HoodieSpark3ResolveReferences"
+  val spark3PlusResolveReferences: RuleBuilder =
+    session => instantiateKlass(spark3PlusResolveReferencesClass, session)
// NOTE: PLEASE READ CAREFULLY BEFORE CHANGING
//
// It's critical for this rules to follow in this order; re-ordering this rules might lead to changes in
// behavior of Spark's analysis phase (for ex, DataSource V2 to V1 fallback might not kick in before other rules,
// leading to all relations resolving as V2 instead of current expectation of them being resolved as V1)
-  rules ++= Seq(dataSourceV2ToV1Fallback, spark3ResolveReferences)
+  rules ++= Seq(dataSourceV2ToV1Fallback, spark3PlusResolveReferences)
-  if (HoodieSparkUtils.gteqSpark3_5) {
+  if (HoodieSparkUtils.isSpark3_5) {
    rules += (_ => instantiateKlass(
      "org.apache.spark.sql.hudi.analysis.HoodieSpark35ResolveColumnsForInsertInto"))
  }
+  if (HoodieSparkUtils.isSpark4_0) {
+    rules += (_ => instantiateKlass(
+      "org.apache.spark.sql.hudi.analysis.HoodieSpark40ResolveColumnsForInsertInto"))
+  }
val resolveAlterTableCommandsClass =
- if (HoodieSparkUtils.gteqSpark3_5) {
+ if (HoodieSparkUtils.gteqSpark4_0) {
+ "org.apache.spark.sql.hudi.Spark40ResolveHudiAlterTableCommand"
Review Comment:
`ResolveHudiAlterTableCommand` for Spark 3.3 is different from the other Spark
versions. If we move the code common to the 3.4, 3.5, and 4.0 versions into
hudi-spark-common, it fails to compile against Spark 3.3. So we leave it as is
while Spark 3.3 is still supported. Jira:
https://issues.apache.org/jira/browse/HUDI-9428
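The dispatch pattern discussed above can be sketched in isolation. This is a hypothetical standalone example, not Hudi's actual code: `HoodieSparkUtils.isSpark4` is reduced to a plain boolean parameter, and only the class-name selection is shown (the real `instantiateKlass` then loads that class reflectively). The class names mirror those in the PR.

```scala
// Minimal sketch of version-gated rule-class selection, assuming a plain
// boolean in place of HoodieSparkUtils.isSpark4. Illustrative only.
object RuleDispatchSketch {
  // Pick the fully qualified rule class name for the running Spark version.
  def resolveReferencesClass(isSpark4: Boolean): String =
    if (isSpark4)
      "org.apache.spark.sql.hudi.analysis.HoodieSpark4ResolveReferences"
    else
      "org.apache.spark.sql.hudi.analysis.HoodieSpark3ResolveReferences"
}
```

Keeping the dispatch to a string and instantiating reflectively is what lets each version-specific rule class live in its own Spark-version module without the common module depending on it at compile time.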
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]