This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 05f72fec83cf [SPARK-47774][INFRA][3.4] Remove redundant rules from `MimaExcludes`
05f72fec83cf is described below

commit 05f72fec83cf3201ca76f0a6410b0ec7d87aa204
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Tue Apr 9 00:52:58 2024 -0700

    [SPARK-47774][INFRA][3.4] Remove redundant rules from `MimaExcludes`

    ### What changes were proposed in this pull request?

    This PR aims to remove redundant rules from `MimaExcludes` for Apache Spark 3.4.x.

    Previously, these rules were required due to a `dev/mima` limitation, which was fixed in
    - https://github.com/apache/spark/pull/45938

    ### Why are the changes needed?

    To minimize the exclusion rules for Apache Spark 3.4.x by removing the rules related to the following `private` classes:
    - `DeployMessages` https://github.com/apache/spark/blob/d3c75540788cf4ce86558feb38c197fdc1c8300e/core/src/main/scala/org/apache/spark/deploy/DeployMessage.scala#L34
    - `ShuffleBlockFetcherIterator` https://github.com/apache/spark/blob/d3c75540788cf4ce86558feb38c197fdc1c8300e/core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala#L85-L86
    - `BlockManagerMessages` https://github.com/apache/spark/blob/d3c75540788cf4ce86558feb38c197fdc1c8300e/core/src/main/scala/org/apache/spark/storage/BlockManagerMessages.scala#L25

    ### Does this PR introduce _any_ user-facing change?

    No.

    ### How was this patch tested?

    Pass the CIs.

    ### Was this patch authored or co-authored using generative AI tooling?

    No.

    Closes #45949 from dongjoon-hyun/SPARK-47774-3.4.
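For context, the rules being removed are MiMa `ProblemFilters` entries declared in `project/MimaExcludes.scala`, which is sbt build configuration rather than application code. A minimal sketch of the shape of such entries follows; the `org.example` names are hypothetical and stand in for the fully qualified Spark class and member names seen in the diff. With the `dev/mima` fix referenced above (apache/spark#45938), members of `private` classes are excluded from the binary-compatibility check automatically, so entries like these become redundant for those classes:

```scala
// Sketch of a MimaExcludes-style build-configuration fragment.
// Class/member names below are hypothetical illustrations only.
import com.typesafe.tools.mima.core._

object ExampleExcludes {
  lazy val excludes = Seq(
    // Suppress a "method removed" report for one specific member.
    ProblemFilters.exclude[DirectMissingMethodProblem](
      "org.example.SomeClass.someMethod"),
    // Suppress a "parent types changed" report for a companion object.
    ProblemFilters.exclude[MissingTypesProblem]("org.example.SomeClass$")
  )
}
```

Each filter names a MiMa problem kind and the fully qualified member it should silence, which is why rules for classes that MiMa no longer inspects can simply be deleted.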
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 project/MimaExcludes.scala | 21 ---------------------
 1 file changed, 21 deletions(-)

diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 0b0fdefd6b68..5e97a8d9551c 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -54,14 +54,6 @@ object MimaExcludes {
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ml.classification.OneVsRest.extractInstances"),
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ml.classification.OneVsRestModel.extractInstances"),

-    // [SPARK-39703][SPARK-39062] Mima complains with Scala 2.13 for the changes in DeployMessages
-    ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.deploy.DeployMessages$LaunchExecutor$"),
-    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.deploy.DeployMessages#RequestExecutors.requestedTotal"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.deploy.DeployMessages#RequestExecutors.copy"),
-    ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.deploy.DeployMessages#RequestExecutors.copy$default$2"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.deploy.DeployMessages#RequestExecutors.this"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.deploy.DeployMessages#RequestExecutors.apply"),
-
     // [SPARK-38679][CORE] Expose the number of partitions in a stage to TaskContext
     ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.TaskContext.numPartitions"),
@@ -115,25 +107,12 @@ object MimaExcludes {
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementName"),
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.scheduler.cluster.CoarseGrainedClusterMessages#Shutdown.productElementNames"),

-    // [SPARK-40950][CORE] Fix isRemoteAddressMaxedOut performance overhead on scala 2.13
-    ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.storage.ShuffleBlockFetcherIterator#FetchRequest.blocks"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.storage.ShuffleBlockFetcherIterator#FetchRequest.copy"),
-    ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.storage.ShuffleBlockFetcherIterator#FetchRequest.copy$default$2"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.storage.ShuffleBlockFetcherIterator#FetchRequest.this"),
-    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.storage.ShuffleBlockFetcherIterator#FetchRequest.apply"),
-
     // [SPARK-41072][SS] Add the error class STREAM_FAILED to StreamingQueryException
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.streaming.StreamingQueryException.this"),

     // [SPARK-41180][SQL] Reuse INVALID_SCHEMA instead of _LEGACY_ERROR_TEMP_1227
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.sql.types.DataType.parseTypeWithFallback"),

-    // [SPARK-41360][CORE] Avoid BlockManager re-registration if the executor has been lost
-    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.storage.BlockManagerMessages#RegisterBlockManager.copy"),
-    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.storage.BlockManagerMessages#RegisterBlockManager.this"),
-    ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.storage.BlockManagerMessages$RegisterBlockManager$"),
-    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.storage.BlockManagerMessages#RegisterBlockManager.apply"),
-
     // [SPARK-41709][CORE][SQL][UI] Explicitly define Seq as collection.Seq to avoid toSeq when create ui objects from protobuf objects for Scala 2.13
     ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.status.api.v1.ApplicationEnvironmentInfo.sparkProperties"),
     ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.status.api.v1.ApplicationEnvironmentInfo.hadoopProperties"),

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org