amaliujia commented on code in PR #40931:
URL: https://github.com/apache/spark/pull/40931#discussion_r1176958568
##########
project/MimaExcludes.scala:
##########
@@ -66,6 +66,12 @@ object MimaExcludes {
ProblemFilters.exclude[Problem]("org.sparkproject.spark_core.protobuf.*"),
ProblemFilters.exclude[Problem]("org.apache.spark.status.protobuf.StoreTypes*"),
+    // SPARK-43265: Move Error framework to a common utils module
+    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.QueryContext"),
+    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.SparkException"),
+    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.SparkException$"),
+    ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.SparkThrowable"),
Review Comment:
First of all, I think the existing MiMa check just compiles `core` and
compares it with the previous 3.4.0 release, so since we are moving classes
out, seeing this report is expected.
If we want to go a step further and verify that, even after moving the
classes, compatibility still holds between core 3.4.0 and
`core+common-utils`, we could probably follow what
`CheckConnectJvmClientCompatibility` does:
1. Pre-run a command to build a jar that includes both `core` and
`common-utils` (not sure if SBT can do this; at the least, we can unzip the
jar files and re-zip them into a combined one).
2. The new suite locates the jar from step 1 and also finds the 3.4.0 jar.
3. The suite then compares the two jars.
However, I am not sure we want to go down this path.
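The unzip-and-re-zip fallback in step 1 could be sketched roughly as below (Python used purely as an illustration; the function name and the "first jar wins on duplicate entries" policy are my assumptions, not anything the PR prescribes):

```python
import zipfile

def merge_jars(jar_paths, out_path):
    """Combine the entries of several jars into one merged jar.

    Jar files are just zip archives, so we copy every entry across.
    On conflicting entry names (e.g. duplicate META-INF files), the
    entry from the earlier jar in the list wins.
    """
    seen = set()
    with zipfile.ZipFile(out_path, "w") as out:
        for jar in jar_paths:
            with zipfile.ZipFile(jar) as src:
                for info in src.infolist():
                    if info.filename in seen:
                        continue  # skip duplicates from later jars
                    seen.add(info.filename)
                    out.writestr(info, src.read(info.filename))
```

A dedicated suite could then point MiMa (or a similar bytecode comparator) at the merged jar and the released 3.4.0 `core` jar, mirroring how `CheckConnectJvmClientCompatibility` locates its two input jars.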
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]