This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 1930116  [SPARK-38040][BUILD] Enable binary compatibility check for APIs in Catalyst, KVStore and Avro modules
1930116 is described below

commit 193011632ba41dc4035460c429374981a8ebe0b7
Author: Hyukjin Kwon <[email protected]>
AuthorDate: Thu Jan 27 13:36:27 2022 +0900

    [SPARK-38040][BUILD] Enable binary compatibility check for APIs in Catalyst, KVStore and Avro modules
    
    ### What changes were proposed in this pull request?
    
    We don't currently run the binary compatibility check in the modules below:
    
    ```
    [info] spark-parent: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-network-common: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-tags: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-unsafe: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-network-shuffle: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-kvstore: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-tools: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-token-provider-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-streaming-kafka-0-10-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-catalyst: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-repl: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-avro: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-sql-kafka-0-10: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-hive: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-assembly: mimaPreviousArtifacts not set, not analyzing binary compatibility
    [info] spark-examples: mimaPreviousArtifacts not set, not analyzing binary compatibility
    ```
    
    However, these modules do contain public APIs. For example,
    https://github.com/apache/spark/blob/master/external/avro/src/main/scala/org/apache/spark/sql/avro/functions.scala for Avro,
    https://github.com/apache/spark/tree/master/common/kvstore/src/main/java/org/apache/spark/util/kvstore for KVStore (to become an API), and
    https://github.com/apache/spark/tree/master/sql/catalyst/src/main/java/org/apache/spark/sql/connector for Catalyst.
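    For context, the check enabled here is driven by sbt-mima-plugin: each checked module declares a previously released artifact to compare against, and internal or not-yet-stable packages are filtered out. A minimal sketch of such settings (the artifact coordinates and version below are illustrative assumptions, not taken from this commit):

    ```scala
    // Illustrative sbt settings for a MiMa-checked module (sbt-mima-plugin).
    // The artifact coordinates and version here are hypothetical examples.
    import com.typesafe.tools.mima.core._
    import com.typesafe.tools.mima.plugin.MimaPlugin.autoImport._

    // Compare the current classes against the last released artifact.
    mimaPreviousArtifacts := Set("org.apache.spark" %% "spark-avro" % "3.2.0")

    // Suppress reports for packages that are internal or not yet stable API.
    mimaBinaryIssueFilters ++= Seq(
      ProblemFilters.exclude[Problem]("org.apache.spark.sql.v2.avro.*")
    )
    ```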
    
    ### Why are the changes needed?
    
    To detect binary incompatibilities in the APIs of these modules.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, dev-only.
    
    ### How was this patch tested?
    
    Manually tested via running `dev/mima`.
    
    Closes #35339 from HyukjinKwon/SPARK-38040.
    
    Authored-by: Hyukjin Kwon <[email protected]>
    Signed-off-by: Hyukjin Kwon <[email protected]>
---
 project/MimaExcludes.scala | 6 ++++++
 project/SparkBuild.scala   | 4 ++--
 2 files changed, 8 insertions(+), 2 deletions(-)

diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index b985f95..f77bc5c 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -66,6 +66,12 @@ object MimaExcludes {
     ProblemFilters.exclude[Problem]("org.apache.spark.sql.catalyst.*"),
     ProblemFilters.exclude[Problem]("org.apache.spark.sql.execution.*"),
     ProblemFilters.exclude[Problem]("org.apache.spark.sql.internal.*"),
+    ProblemFilters.exclude[Problem]("org.apache.spark.sql.errors.*"),
+    // DSv2 catalog and expression APIs are unstable yet. We should enable this back.
+    ProblemFilters.exclude[Problem]("org.apache.spark.sql.connector.catalog.*"),
+    ProblemFilters.exclude[Problem]("org.apache.spark.sql.connector.expressions.*"),
+    // Avro source implementation is internal.
+    ProblemFilters.exclude[Problem]("org.apache.spark.sql.v2.avro.*"),
 
     // [SPARK-34848][CORE] Add duration to TaskMetricDistributions
     ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.TaskMetricDistributions.this"),
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 02ffa23..ad9aef5 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -376,8 +376,8 @@ object SparkBuild extends PomBuild {
 
   val mimaProjects = allProjects.filterNot { x =>
     Seq(
-      spark, hive, hiveThriftServer, catalyst, repl, networkCommon, networkShuffle, networkYarn,
-      unsafe, tags, tokenProviderKafka010, sqlKafka010, kvstore, avro
+      spark, hive, hiveThriftServer, repl, networkCommon, networkShuffle, networkYarn,
+      unsafe, tags, tokenProviderKafka010, sqlKafka010
     ).contains(x)
   }
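
As a toy illustration (project names below are placeholders, not real sbt project references) of how shrinking the exclusion list above brings the three modules under the MiMa check:

```scala
// Toy model of the SparkBuild filterNot change (project names are placeholders).
val allProjects = Seq("catalyst", "kvstore", "avro", "repl", "hive")
// Before this commit, catalyst, kvstore, and avro were also in the excluded list.
val excluded = Seq("repl", "hive")
val mimaProjects = allProjects.filterNot(excluded.contains)
println(mimaProjects.mkString(", ")) // prints: catalyst, kvstore, avro
```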
 

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
