This is an automated email from the ASF dual-hosted git repository.
hvanhovell pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new 61a666d158b [SPARK-42640][CONNECT] Remove stale entries from the excluding rules for CompatibilitySuite
61a666d158b is described below
commit 61a666d158b81c5d36b9c2507b6653ff22b1af76
Author: Rui Wang <[email protected]>
AuthorDate: Thu Mar 2 16:54:51 2023 -0400
[SPARK-42640][CONNECT] Remove stale entries from the excluding rules for CompatibilitySuite
### What changes were proposed in this pull request?
Remove stale entries from the excluding rules for CompatibilitySuite.
### Why are the changes needed?
Keep API compatibility list in sync.
### Does this PR introduce _any_ user-facing change?
NO
### How was this patch tested?
UT
Closes #40241 from amaliujia/remove_entries.
Authored-by: Rui Wang <[email protected]>
Signed-off-by: Herman van Hovell <[email protected]>
(cherry picked from commit 2e6ce42585ccb3e6140980580aaf6cbc160f7eef)
Signed-off-by: Herman van Hovell <[email protected]>
---
.../spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala | 2 --
1 file changed, 2 deletions(-)
diff --git a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala
index 4f4ca9ad990..548608f50b5 100644
--- a/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala
+++ b/connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/client/CheckConnectJvmClientCompatibility.scala
@@ -143,7 +143,6 @@ object CheckConnectJvmClientCompatibility {
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.queryExecution"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.encoder"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.sqlContext"),
- ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.as"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.na"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.stat"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.joinWith"),
@@ -158,7 +157,6 @@ object CheckConnectJvmClientCompatibility {
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.flatMap"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.foreach"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.foreachPartition"),
- ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.persist"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.storageLevel"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.rdd"),
ProblemFilters.exclude[Problem]("org.apache.spark.sql.Dataset.toJavaRDD"),
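The removed lines are MiMa `ProblemFilters` exclusions: each rule suppresses a reported binary-compatibility problem for one fully-qualified member, and a rule becomes stale once the Connect client actually implements that member. As a rough, self-contained sketch of the idea (not the real MiMa API; `ExcludeRule` and `unexcluded` are hypothetical names), the filtering behaves like:

```scala
// Minimal model of exclusion-rule filtering, assuming each reported
// problem is identified by its fully-qualified member name.
object ProblemFilterSketch {
  // One exclusion rule: suppresses problems for exactly this member.
  final case class ExcludeRule(qualifiedName: String) {
    def matches(problem: String): Boolean = problem == qualifiedName
  }

  // Keep only the problems that no exclusion rule covers; these are
  // the incompatibilities the suite will actually report.
  def unexcluded(problems: Seq[String], rules: Seq[ExcludeRule]): Seq[String] =
    problems.filterNot(p => rules.exists(_.matches(p)))
}
```

Under this model, dropping the `Dataset.as` and `Dataset.persist` rules means any future incompatibility in those members would surface again instead of being silently excluded, which is why the compatibility list should be kept in sync.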
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]