This is an automated email from the ASF dual-hosted git repository.
lzljs3620320 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/paimon.git
The following commit(s) were added to refs/heads/master by this push:
new ab3cb79a7 [spark] remove spark 3.1 (#4237)
ab3cb79a7 is described below
commit ab3cb79a71e4c125fd18fc8d1571d01d25bdfc4d
Author: Yann Byron <[email protected]>
AuthorDate: Tue Sep 24 10:39:30 2024 +0800
[spark] remove spark 3.1 (#4237)
---
docs/content/engines/overview.md | 18 +--
docs/content/project/download.md | 2 -
docs/content/spark/quick-start.md | 8 +-
docs/content/spark/sql-query.md | 2 -
.../apache/paimon/spark/PaimonInputPartition.scala | 25 -----
.../apache/paimon/spark/PaimonScanBuilder.scala | 23 ----
.../analysis/PaimonIncompatiblePHRRules.scala | 30 -----
.../PaimonIncompatibleResolutionRules.scala | 30 -----
.../spark/catalyst/analysis/PaimonMergeInto.scala | 57 ----------
.../spark/catalyst/trees/PaimonLeafLike.scala | 28 -----
.../org/apache/paimon/spark/leafnode/package.scala | 37 -------
.../paimon/spark/statistics/StatisticsHelper.scala | 30 -----
.../apache/paimon/spark/util/SQLConfUtils.scala | 28 -----
.../apache/paimon/spark/util/shim/TypeUtils.scala | 25 -----
.../paimon/spark/SparkGenericCatalogTest.java | 123 ---------------------
.../src/test/resources/log4j2-test.properties | 28 -----
paimon-spark/pom.xml | 1 -
17 files changed, 10 insertions(+), 485 deletions(-)
diff --git a/docs/content/engines/overview.md b/docs/content/engines/overview.md
index 264071d99..7dba9b49f 100644
--- a/docs/content/engines/overview.md
+++ b/docs/content/engines/overview.md
@@ -28,15 +28,15 @@ under the License.
## Compatibility Matrix
-| Engine | Version | Batch Read | Batch Write | Create Table | Alter Table | Streaming Write | Streaming Read | Batch Overwrite | DELETE & UPDATE | MERGE INTO | Time Travel |
-|:-------------------------------------------------------------------------------:|:-------------:|:-----------:|:-----------:|:-------------:|:-------------:|:----------------:|:----------------:|:---------------:|:------------------:|:-----------:|:-----------:|
-| Flink | 1.15 - 1.20 | ✅ | ✅ | ✅ | ✅(1.17+) | ✅ | ✅ | ✅ | ✅(1.17+) | ❌ | ✅ |
-| Spark | 3.1 - 3.5 | ✅ | ✅(3.2+) | ✅ | ✅ | ✅(3.3+) | ✅(3.3+) | ✅(3.2+) | ✅(3.2+) | ✅(3.2+) | ✅(3.3+) |
-| Hive | 2.1 - 3.1 | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
-| Trino | 420 - 439 | ✅ | ✅(427+) | ✅(427+) | ✅(427+) | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
-| Presto | 0.236 - 0.280 | ✅ | ❌ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
-| [StarRocks](https://docs.starrocks.io/docs/data_source/catalog/paimon_catalog/) | 3.1+ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
-| [Doris](https://doris.apache.org/docs/lakehouse/datalake-analytics/paimon) | 2.0.6+ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
+| Engine | Version | Batch Read | Batch Write | Create Table | Alter Table | Streaming Write | Streaming Read | Batch Overwrite | DELETE & UPDATE | MERGE INTO | Time Travel |
+|:-------------------------------------------------------------------------------:|:-------------:|:-----------:|:-----------:|:-------------:|:-------------:|:----------------:|:----------------:|:---------------:|:---------------:|:----------:|:-----------:|
+| Flink | 1.15 - 1.20 | ✅ | ✅ | ✅ | ✅(1.17+) | ✅ | ✅ | ✅ | ✅(1.17+) | ❌ | ✅ |
+| Spark | 3.2 - 3.5 | ✅ | ✅ | ✅ | ✅ | ✅(3.3+) | ✅(3.3+) | ✅ | ✅ | ✅ | ✅(3.3+) |
+| Hive | 2.1 - 3.1 | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
+| Trino | 420 - 439 | ✅ | ✅(427+) | ✅(427+) | ✅(427+) | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
+| Presto | 0.236 - 0.280 | ✅ | ❌ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
+| [StarRocks](https://docs.starrocks.io/docs/data_source/catalog/paimon_catalog/) | 3.1+ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
+| [Doris](https://doris.apache.org/docs/lakehouse/datalake-analytics/paimon) | 2.0.6+ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ |
## Streaming Engines
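The substantive change in the matrix above is the Spark row: the supported range narrows from 3.1 - 3.5 to 3.2 - 3.5, and the per-feature `(3.2+)` qualifiers disappear because 3.2 is now the floor. As an illustrative sketch only (the class and method names below are invented, not part of any Paimon API), the new support check reduces to a membership test on the remaining versions:

```java
import java.util.List;

// Illustrative only: encodes the post-commit Spark support range (3.2 - 3.5).
public class SparkSupport {
    private static final List<String> SUPPORTED = List.of("3.2", "3.3", "3.4", "3.5");

    public static boolean isSupported(String sparkMinorVersion) {
        return SUPPORTED.contains(sparkMinorVersion);
    }

    public static void main(String[] args) {
        System.out.println(isSupported("3.1")); // false: dropped by this commit
        System.out.println(isSupported("3.5")); // true
    }
}
```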
diff --git a/docs/content/project/download.md b/docs/content/project/download.md
index f9189132a..5e4981107 100644
--- a/docs/content/project/download.md
+++ b/docs/content/project/download.md
@@ -45,7 +45,6 @@ This documentation is a guide for downloading Paimon Jars.
| Spark 3.4 | [paimon-spark-3.4-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.4/{{< version >}}/) |
| Spark 3.3 | [paimon-spark-3.3-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.3/{{< version >}}/) |
| Spark 3.2 | [paimon-spark-3.2-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.2/{{< version >}}/) |
-| Spark 3.1 | [paimon-spark-3.1-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.1/{{< version >}}/) |
| Hive 3.1 | [paimon-hive-connector-3.1-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-hive-connector-3.1/{{< version >}}/) |
| Hive 2.3 | [paimon-hive-connector-2.3-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-hive-connector-2.3/{{< version >}}/) |
| Hive 2.2 | [paimon-hive-connector-2.2-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-hive-connector-2.2/{{< version >}}/) |
@@ -75,7 +74,6 @@ This documentation is a guide for downloading Paimon Jars.
| Spark 3.4 | [paimon-spark-3.4-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.4/{{< version >}}/paimon-spark-3.4-{{< version >}}.jar) |
| Spark 3.3 | [paimon-spark-3.3-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.3/{{< version >}}/paimon-spark-3.3-{{< version >}}.jar) |
| Spark 3.2 | [paimon-spark-3.2-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.2/{{< version >}}/paimon-spark-3.2-{{< version >}}.jar) |
-| Spark 3.1 | [paimon-spark-3.1-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.1/{{< version >}}/paimon-spark-3.1-{{< version >}}.jar) |
| Hive 3.1 | [paimon-hive-connector-3.1-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-hive-connector-3.1/{{< version >}}/paimon-hive-connector-3.1-{{< version >}}.jar) |
| Hive 2.3 | [paimon-hive-connector-2.3-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-hive-connector-2.3/{{< version >}}/paimon-hive-connector-2.3-{{< version >}}.jar) |
| Hive 2.2 | [paimon-hive-connector-2.2-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-hive-connector-2.2/{{< version >}}/paimon-hive-connector-2.2-{{< version >}}.jar) |
diff --git a/docs/content/spark/quick-start.md b/docs/content/spark/quick-start.md
index 6687d2959..04f53bf50 100644
--- a/docs/content/spark/quick-start.md
+++ b/docs/content/spark/quick-start.md
@@ -28,7 +28,7 @@ under the License.
## Preparation
-Paimon currently supports Spark 3.5, 3.4, 3.3, 3.2 and 3.1. We recommend the latest Spark version for a better experience.
+Paimon currently supports Spark 3.5, 3.4, 3.3, and 3.2. We recommend the latest Spark version for a better experience.
Download the jar file with corresponding version.
@@ -40,7 +40,6 @@ Download the jar file with corresponding version.
| Spark 3.4 | [paimon-spark-3.4-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.4/{{< version >}}/paimon-spark-3.4-{{< version >}}.jar) |
| Spark 3.3 | [paimon-spark-3.3-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.3/{{< version >}}/paimon-spark-3.3-{{< version >}}.jar) |
| Spark 3.2 | [paimon-spark-3.2-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.2/{{< version >}}/paimon-spark-3.2-{{< version >}}.jar) |
-| Spark 3.1 | [paimon-spark-3.1-{{< version >}}.jar](https://repo.maven.apache.org/maven2/org/apache/paimon/paimon-spark-3.1/{{< version >}}/paimon-spark-3.1-{{< version >}}.jar) |
{{< /stable >}}
@@ -52,7 +51,6 @@ Download the jar file with corresponding version.
| Spark 3.4 | [paimon-spark-3.4-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.4/{{< version >}}/) |
| Spark 3.3 | [paimon-spark-3.3-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.3/{{< version >}}/) |
| Spark 3.2 | [paimon-spark-3.2-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.2/{{< version >}}/) |
-| Spark 3.1 | [paimon-spark-3.1-{{< version >}}.jar](https://repository.apache.org/snapshots/org/apache/paimon/paimon-spark-3.1/{{< version >}}/) |
{{< /unstable >}}
@@ -179,10 +177,6 @@ tblproperties (
## Insert Table
-{{< hint info >}}
-Paimon currently supports Spark 3.2+ for SQL write.
-{{< /hint >}}
-
```sql
INSERT INTO my_table VALUES (1, 'Hi'), (2, 'Hello');
```
diff --git a/docs/content/spark/sql-query.md b/docs/content/spark/sql-query.md
index 3543355bc..e118b8418 100644
--- a/docs/content/spark/sql-query.md
+++ b/docs/content/spark/sql-query.md
@@ -75,8 +75,6 @@ For example:
By default, will scan changelog files for the table which produces changelog files. Otherwise, scan newly changed files.
You can also force specifying `'incremental-between-scan-mode'`.
-Requires Spark 3.2+.
-
Paimon supports that use Spark SQL to do the incremental query that implemented by Spark Table Valued Function.
you can use `paimon_incremental_query` in query to extract the incremental data:
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonInputPartition.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonInputPartition.scala
deleted file mode 100644
index 49bc71e93..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonInputPartition.scala
+++ /dev/null
@@ -1,25 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark
-
-import org.apache.paimon.table.source.Split
-
-// never be used
-case class PaimonBucketedInputPartition(splits: Seq[Split], bucket: Int)
- extends PaimonInputPartition
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonScanBuilder.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonScanBuilder.scala
deleted file mode 100644
index 10b83ccf0..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/PaimonScanBuilder.scala
+++ /dev/null
@@ -1,23 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark
-
-import org.apache.paimon.table.Table
-
-class PaimonScanBuilder(table: Table) extends PaimonBaseScanBuilder(table)
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatiblePHRRules.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatiblePHRRules.scala
deleted file mode 100644
index 7a1edf2f8..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatiblePHRRules.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.catalyst.analysis
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.catalyst.rules.Rule
-
-/** These post-hoc resolution rules are incompatible between different versions of spark. */
-case class PaimonIncompatiblePHRRules(session: SparkSession) extends Rule[LogicalPlan] {
-
- override def apply(plan: LogicalPlan): LogicalPlan = plan
-
-}
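The deleted rule above is a deliberate no-op on Spark 3.1: `apply` returns the plan unchanged, and the class exists only so each Spark version module exposes the same rule names. A rough, hypothetical Java analogue of such an identity placeholder rule (not Spark or Paimon API) is simply:

```java
import java.util.function.UnaryOperator;

// Hypothetical analogue of a no-op Catalyst rule: apply() returns its input
// unchanged, serving only as a version-compatibility placeholder.
public class NoOpRule {
    public static <T> UnaryOperator<T> identityRule() {
        return plan -> plan;
    }

    public static void main(String[] args) {
        String plan = "LogicalPlan";
        // The rule leaves the plan untouched.
        System.out.println(NoOpRule.<String>identityRule().apply(plan).equals(plan));
    }
}
```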
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatibleResolutionRules.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatibleResolutionRules.scala
deleted file mode 100644
index c8d0249da..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonIncompatibleResolutionRules.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.catalyst.analysis
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.catalyst.rules.Rule
-
-/** These resolution rules are incompatible between different versions of spark. */
-case class PaimonIncompatibleResolutionRules(session: SparkSession) extends Rule[LogicalPlan] {
-
- override def apply(plan: LogicalPlan): LogicalPlan = plan
-
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonMergeInto.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonMergeInto.scala
deleted file mode 100644
index b16fbb727..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/analysis/PaimonMergeInto.scala
+++ /dev/null
@@ -1,57 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.catalyst.analysis
-
-import org.apache.paimon.spark.SparkTable
-import org.apache.paimon.spark.commands.MergeIntoPaimonTable
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.expressions.AttributeReference
-import org.apache.spark.sql.catalyst.plans.logical.{MergeAction, MergeIntoTable}
-
-/** A post-hoc resolution rule for MergeInto. */
-case class PaimonMergeInto(spark: SparkSession) extends PaimonMergeIntoBase {
-
- override def resolveNotMatchedBySourceActions(
- merge: MergeIntoTable,
- targetOutput: Seq[AttributeReference]): Seq[MergeAction] = {
- Seq.empty
- }
-
-  override def buildMergeIntoPaimonTable(
-      v2Table: SparkTable,
-      merge: MergeIntoTable,
-      alignedMatchedActions: Seq[MergeAction],
-      alignedNotMatchedActions: Seq[MergeAction],
-      alignedNotMatchedBySourceActions: Seq[MergeAction]): MergeIntoPaimonTable = {
-    if (alignedNotMatchedBySourceActions.nonEmpty) {
-      throw new RuntimeException("WHEN NOT MATCHED BY SOURCE is not supported here.")
-    }
-
- MergeIntoPaimonTable(
- v2Table,
- merge.targetTable,
- merge.sourceTable,
- merge.mergeCondition,
- alignedMatchedActions,
- alignedNotMatchedActions,
- alignedNotMatchedBySourceActions
- )
- }
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/trees/PaimonLeafLike.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/trees/PaimonLeafLike.scala
deleted file mode 100644
index 38d56bd31..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/catalyst/trees/PaimonLeafLike.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.catalyst.trees
-
-import org.apache.spark.sql.catalyst.trees.TreeNode
-
-trait PaimonLeafLike[T <: TreeNode[T]] {
- self: TreeNode[T] =>
- final override def children: Seq[T] = Nil
-
- final override def mapChildren(f: T => T): T = this.asInstanceOf[T]
-}
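The deleted `PaimonLeafLike` trait above pins `children` to `Nil` and makes `mapChildren` the identity, which is how leaf tree nodes had to be expressed on Spark 3.1. A rough Java analogue, with invented names and shown purely for illustration, looks like:

```java
import java.util.Collections;
import java.util.List;
import java.util.function.UnaryOperator;

// Hypothetical Java analogue of a leaf tree-node mixin: no children, and
// mapping over children returns the node itself unchanged.
interface LeafLike<T> {
    default List<T> children() {
        return Collections.emptyList();
    }

    @SuppressWarnings("unchecked")
    default T mapChildren(UnaryOperator<T> f) {
        return (T) this; // a leaf has nothing to map over
    }
}

class LeafNode implements LeafLike<LeafNode> {}

public class LeafLikeDemo {
    public static void main(String[] args) {
        LeafNode n = new LeafNode();
        System.out.println(n.children().isEmpty()); // true
        System.out.println(n.mapChildren(x -> new LeafNode()) == n); // true
    }
}
```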
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/leafnode/package.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/leafnode/package.scala
deleted file mode 100644
index 870aa13f6..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/leafnode/package.scala
+++ /dev/null
@@ -1,37 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark
-
-import org.apache.paimon.spark.catalyst.trees.PaimonLeafLike
-
-import org.apache.spark.sql.catalyst.plans.logical.{Command, LogicalPlan, ParsedStatement}
-import org.apache.spark.sql.execution.SparkPlan
-import org.apache.spark.sql.execution.command.RunnableCommand
-import org.apache.spark.sql.execution.datasources.v2.V2CommandExec
-
-package object leafnode {
-  trait PaimonLeafParsedStatement extends ParsedStatement with PaimonLeafLike[LogicalPlan]
-
-  trait PaimonLeafRunnableCommand extends RunnableCommand with PaimonLeafLike[LogicalPlan]
-
-  trait PaimonLeafCommand extends Command with PaimonLeafLike[LogicalPlan]
-
-  trait PaimonLeafV2CommandExec extends V2CommandExec with PaimonLeafLike[SparkPlan]
-
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/statistics/StatisticsHelper.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/statistics/StatisticsHelper.scala
deleted file mode 100644
index e64785dde..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/statistics/StatisticsHelper.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.statistics
-
-import org.apache.spark.sql.catalyst.expressions.Attribute
-import org.apache.spark.sql.catalyst.plans.logical
-import org.apache.spark.sql.connector.read.Statistics
-import org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation
-
-trait StatisticsHelper extends StatisticsHelperBase {
-  protected def toV1Stats(v2Stats: Statistics, attrs: Seq[Attribute]): logical.Statistics = {
-    DataSourceV2Relation.transformV2Stats(v2Stats, None, conf.defaultSizeInBytes)
-  }
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/SQLConfUtils.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/SQLConfUtils.scala
deleted file mode 100644
index ca402d794..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/SQLConfUtils.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.util
-
-import org.apache.paimon.catalog.Catalog
-
-import org.apache.spark.sql.internal.SQLConf
-
-/** SQLConf utils. */
-object SQLConfUtils {
- def defaultDatabase(sqlConf: SQLConf): String = Catalog.DEFAULT_DATABASE
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/shim/TypeUtils.scala b/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/shim/TypeUtils.scala
deleted file mode 100644
index dcd2d6889..000000000
--- a/paimon-spark/paimon-spark-3.1/src/main/scala/org/apache/paimon/spark/util/shim/TypeUtils.scala
+++ /dev/null
@@ -1,25 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark.util.shim
-
-object TypeUtils {
-
-  // Since Spark 3.3 and below do not support timestamp ntz, treat Paimon TimestampType as Spark TimestampType
- def treatPaimonTimestampTypeAsSparkTimestampType(): Boolean = true
-}
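The deleted shim above hard-codes `true` because, as its comment notes, Spark 3.3 and below do not support timestamp ntz, so Paimon's `TimestampType` must be mapped to Spark's `TimestampType` there. As a hedged sketch (invented names, not Paimon or Spark API), such a version-gated shim decision amounts to:

```java
// Hypothetical version-gated shim: for Spark 3.3 and below, Paimon's
// TimestampType is treated as Spark's TimestampType because those versions
// lack a timestamp-without-time-zone type.
public class TimestampShim {
    public static boolean treatPaimonTimestampAsSparkTimestamp(int major, int minor) {
        return major < 3 || (major == 3 && minor <= 3);
    }

    public static void main(String[] args) {
        System.out.println(treatPaimonTimestampAsSparkTimestamp(3, 1)); // true
        System.out.println(treatPaimonTimestampAsSparkTimestamp(3, 4)); // false
    }
}
```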
diff --git a/paimon-spark/paimon-spark-3.1/src/test/java/org/apache/paimon/spark/SparkGenericCatalogTest.java b/paimon-spark/paimon-spark-3.1/src/test/java/org/apache/paimon/spark/SparkGenericCatalogTest.java
deleted file mode 100644
index bb3e81c9d..000000000
--- a/paimon-spark/paimon-spark-3.1/src/test/java/org/apache/paimon/spark/SparkGenericCatalogTest.java
+++ /dev/null
@@ -1,123 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements. See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership. The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License. You may obtain a copy of the License at
- *
- * http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.paimon.spark;
-
-import org.apache.paimon.data.BinaryString;
-import org.apache.paimon.data.GenericRow;
-import org.apache.paimon.fs.Path;
-import org.apache.paimon.fs.local.LocalFileIO;
-import org.apache.paimon.spark.extensions.PaimonSparkSessionExtensions;
-import org.apache.paimon.table.FileStoreTable;
-import org.apache.paimon.table.FileStoreTableFactory;
-import org.apache.paimon.table.sink.BatchTableCommit;
-import org.apache.paimon.table.sink.BatchTableWrite;
-import org.apache.paimon.table.sink.BatchWriteBuilder;
-
-import org.apache.spark.sql.Row;
-import org.apache.spark.sql.SparkSession;
-import org.junit.jupiter.api.AfterAll;
-import org.junit.jupiter.api.BeforeAll;
-import org.junit.jupiter.api.Test;
-import org.junit.jupiter.api.io.TempDir;
-
-import java.util.List;
-
-import static org.assertj.core.api.Assertions.assertThat;
-
-/** Base tests for spark read. */
-public class SparkGenericCatalogTest {
-
- protected static SparkSession spark = null;
-
- protected static Path warehousePath = null;
-
-    @BeforeAll
-    public static void startMetastoreAndSpark(@TempDir java.nio.file.Path tempDir) {
-        warehousePath = new Path("file:" + tempDir.toString());
-        spark =
-                SparkSession.builder()
-                        .config("spark.sql.warehouse.dir", warehousePath.toString())
-                        .config(
-                                "spark.sql.extensions",
-                                PaimonSparkSessionExtensions.class.getName())
-                        .master("local[2]")
-                        .getOrCreate();
-        spark.conf().set("spark.sql.catalog.spark_catalog", SparkGenericCatalog.class.getName());
-    }
-
- @AfterAll
- public static void stopMetastoreAndSpark() {
- if (spark != null) {
- spark.stop();
- spark = null;
- }
- }
-
-    @Test
-    public void testPaimonTable() throws Exception {
-        spark.sql(
-                "CREATE TABLE PT (a INT, b INT, c STRING) USING paimon TBLPROPERTIES" + " ('file.format'='avro')");
-        writeTable(
-                "PT",
-                GenericRow.of(1, 2, BinaryString.fromString("3")),
-                GenericRow.of(4, 5, BinaryString.fromString("6")));
-        assertThat(spark.sql("SELECT * FROM PT").collectAsList().stream().map(Object::toString))
-                .containsExactlyInAnyOrder("[1,2,3]", "[4,5,6]");
-
-        spark.sql("CREATE DATABASE my_db");
-        spark.sql(
-                "CREATE TABLE DB_PT (a INT, b INT, c STRING) USING paimon TBLPROPERTIES" + " ('file.format'='avro')");
-        writeTable(
-                "DB_PT",
-                GenericRow.of(1, 2, BinaryString.fromString("3")),
-                GenericRow.of(4, 5, BinaryString.fromString("6")));
-        assertThat(spark.sql("SELECT * FROM DB_PT").collectAsList().stream().map(Object::toString))
-                .containsExactlyInAnyOrder("[1,2,3]", "[4,5,6]");
-
-        assertThat(spark.sql("SHOW NAMESPACES").collectAsList().stream().map(Object::toString))
-                .containsExactlyInAnyOrder("[default]", "[my_db]");
-    }
-
-    @Test
-    public void testCsvTable() {
-        spark.sql("CREATE TABLE CT (a INT, b INT, c STRING) USING csv");
-        spark.sql("INSERT INTO CT VALUES (1, 2, '3'), (4, 5, '6')").collectAsList();
-        List<Row> rows = spark.sql("SELECT * FROM CT").collectAsList();
-        assertThat(rows.stream().map(Object::toString))
-                .containsExactlyInAnyOrder("[1,2,3]", "[4,5,6]");
-    }
-
-    private static void writeTable(String tableName, GenericRow... rows) throws Exception {
-        FileStoreTable fileStoreTable =
-                FileStoreTableFactory.create(
-                        LocalFileIO.create(),
-                        new Path(warehousePath, String.format("default.db/%s", tableName)));
-        BatchWriteBuilder writeBuilder = fileStoreTable.newBatchWriteBuilder();
-        BatchTableWrite writer = writeBuilder.newWrite();
-        BatchTableCommit commit = writeBuilder.newCommit();
-        for (GenericRow row : rows) {
-            writer.write(row);
-        }
-        commit.commit(writer.prepareCommit());
-        writer.close();
-        commit.close();
-    }
-}
diff --git a/paimon-spark/paimon-spark-3.1/src/test/resources/log4j2-test.properties b/paimon-spark/paimon-spark-3.1/src/test/resources/log4j2-test.properties
deleted file mode 100644
index 1b3980d15..000000000
--- a/paimon-spark/paimon-spark-3.1/src/test/resources/log4j2-test.properties
+++ /dev/null
@@ -1,28 +0,0 @@
-################################################################################
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements. See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership. The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-################################################################################
-
-# Set root logger level to OFF to not flood build logs
-# set manually to INFO for debugging purposes
-rootLogger.level = OFF
-rootLogger.appenderRef.test.ref = TestLogger
-
-appender.testlogger.name = TestLogger
-appender.testlogger.type = CONSOLE
-appender.testlogger.target = SYSTEM_ERR
-appender.testlogger.layout.type = PatternLayout
-appender.testlogger.layout.pattern = %-4r [%tid %t] %-5p %c %x - %m%n
diff --git a/paimon-spark/pom.xml b/paimon-spark/pom.xml
index efc34d67f..c06b3a3d4 100644
--- a/paimon-spark/pom.xml
+++ b/paimon-spark/pom.xml
@@ -42,7 +42,6 @@ under the License.
<module>paimon-spark-3.5</module>
<module>paimon-spark-3.4</module>
<module>paimon-spark-3.3</module>
- <module>paimon-spark-3.1</module>
<module>paimon-spark-3.2</module>
</modules>