This is an automated email from the ASF dual-hosted git repository.

emaynard pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/polaris.git


The following commit(s) were added to refs/heads/main by this push:
     new f5871c58f Revert "Reuse shadowJar for spark client bundle jar maven publish (#1857)" (#1921)
f5871c58f is described below

commit f5871c58f5359ac671cdcd0908d7bcb00729678b
Author: Yun Zou <yunzou.colost...@gmail.com>
AuthorDate: Sun Jun 22 08:24:52 2025 -0700

    Revert "Reuse shadowJar for spark client bundle jar maven publish (#1857)" (#1921)
    
    …857)"
    
    This reverts commit 1f7f127536a088911bf940addd1d05c07ff99a68.
    
    The shadowJar plugin actually stops publishing the original jar, which is not what the spark client intends to publish for the --packages usage.
    
    Revert it for now; we will follow up with a better way to reuse the shadow jar plugin, likely with a separate bundle project.
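
    For context, the two pieces this revert restores can be sketched as follows
    (Gradle Kotlin DSL; a condensed sketch of the changes in this diff, not a
    complete build script):

    ```kotlin
    import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar

    // In the spark module: register a standalone ShadowJar task rather than
    // reusing the plugin-provided "shadowJar" task, so the plain "jar" artifact
    // is still produced and published for --packages consumers.
    tasks.register<ShadowJar>("createPolarisSparkJar") {
      archiveClassifier = "bundle" // keeps the bundle distinct from the plain jar
      isZip64 = true
    }

    // In the publishing helper: attach the bundle jar as an extra publication
    // artifact, but only for projects that define the task.
    if (project.tasks.findByName("createPolarisSparkJar") != null) {
      artifact(project.tasks.named("createPolarisSparkJar").get())
    }
    ```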
---
 .../src/main/kotlin/publishing/PublishingHelperPlugin.kt   |  5 +++++
 plugins/spark/README.md                                    |  8 +++-----
 plugins/spark/v3.5/spark/build.gradle.kts                  | 14 ++++----------
 site/content/in-dev/unreleased/polaris-spark-client.md     | 11 -----------
 4 files changed, 12 insertions(+), 26 deletions(-)

diff --git a/build-logic/src/main/kotlin/publishing/PublishingHelperPlugin.kt b/build-logic/src/main/kotlin/publishing/PublishingHelperPlugin.kt
index 04b04225e..d4d412a30 100644
--- a/build-logic/src/main/kotlin/publishing/PublishingHelperPlugin.kt
+++ b/build-logic/src/main/kotlin/publishing/PublishingHelperPlugin.kt
@@ -133,6 +133,11 @@ constructor(private val softwareComponentFactory: SoftwareComponentFactory) : Pl
 
                 suppressPomMetadataWarningsFor("testFixturesApiElements")
                 suppressPomMetadataWarningsFor("testFixturesRuntimeElements")
+
+                if (project.tasks.findByName("createPolarisSparkJar") != null) {
+                  // if the project contains spark client jar, also publish the jar to maven
+                  artifact(project.tasks.named("createPolarisSparkJar").get())
+                }
               }
 
               if (
diff --git a/plugins/spark/README.md b/plugins/spark/README.md
index 3f4acc31c..c7d6bc876 100644
--- a/plugins/spark/README.md
+++ b/plugins/spark/README.md
@@ -29,17 +29,15 @@ Right now, the plugin only provides support for Spark 3.5, Scala version 2.12 an
 and depends on iceberg-spark-runtime 1.9.0.
 
 # Build Plugin Jar
-A shadowJar task is added to build a jar for the Polaris Spark plugin, the jar is named as:
+A task createPolarisSparkJar is added to build a jar for the Polaris Spark plugin, the jar is named as:
 `polaris-spark-<sparkVersion>_<scalaVersion>-<polarisVersion>-bundle.jar`. For example:
 `polaris-spark-3.5_2.12-0.11.0-beta-incubating-SNAPSHOT-bundle.jar`.
 
-- `./gradlew :polaris-spark-3.5_2.12:shadowJar` -- build jar for Spark 3.5 with Scala version 2.12.
-- `./gradlew :polaris-spark-3.5_2.13:shadowJar` -- build jar for Spark 3.5 with Scala version 2.13.
+- `./gradlew :polaris-spark-3.5_2.12:createPolarisSparkJar` -- build jar for Spark 3.5 with Scala version 2.12.
+- `./gradlew :polaris-spark-3.5_2.13:createPolarisSparkJar` -- build jar for Spark 3.5 with Scala version 2.13.
 
 The result jar is located at plugins/spark/v3.5/build/<scala_version>/libs after the build.
 
-The shadowJar task is also executed automatically when you run `gradlew assemble` or `gradlew build`.
-
 # Start Spark with Local Polaris Service using built Jar
 Once the jar is built, we can manually test it with Spark and a local Polaris service.
 
diff --git a/plugins/spark/v3.5/spark/build.gradle.kts b/plugins/spark/v3.5/spark/build.gradle.kts
index c328bb23e..a2a54e26b 100644
--- a/plugins/spark/v3.5/spark/build.gradle.kts
+++ b/plugins/spark/v3.5/spark/build.gradle.kts
@@ -19,10 +19,7 @@
 
 import com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar
 
-plugins {
-  id("polaris-client")
-  id("com.gradleup.shadow")
-}
+plugins { id("polaris-client") }
 
 // get version information
 val sparkMajorVersion = "3.5"
@@ -115,7 +112,7 @@ dependencies {
   }
 }
 
-tasks.named<ShadowJar>("shadowJar") {
+tasks.register<ShadowJar>("createPolarisSparkJar") {
   archiveClassifier = "bundle"
   isZip64 = true
 
@@ -138,11 +135,8 @@ tasks.named<ShadowJar>("shadowJar") {
     exclude(dependency("org.apache.avro:avro*.*"))
   }
 
-  relocate("com.fasterxml", "org.apache.polaris.shaded.com.fasterxml")
+  relocate("com.fasterxml", "org.apache.polaris.shaded.com.fasterxml.jackson")
   relocate("org.apache.avro", "org.apache.polaris.shaded.org.apache.avro")
 }
 
-// ensure the shadowJar job is run for both `assemble` and `build` task
-tasks.named("assemble") { dependsOn("shadowJar") }
-
-tasks.named("build") { dependsOn("shadowJar") }
+tasks.withType(Jar::class).named("sourcesJar") { dependsOn("createPolarisSparkJar") }
diff --git a/site/content/in-dev/unreleased/polaris-spark-client.md b/site/content/in-dev/unreleased/polaris-spark-client.md
index a34bceece..4ceb536a9 100644
--- a/site/content/in-dev/unreleased/polaris-spark-client.md
+++ b/site/content/in-dev/unreleased/polaris-spark-client.md
@@ -128,14 +128,3 @@ The Polaris Spark client has the following functionality limitations:
 3) Rename a Delta table is not supported.
 4) ALTER TABLE ... SET LOCATION is not supported for DELTA table.
 5) For other non-Iceberg tables like csv, it is not supported.
-
-## Iceberg Spark Client compatibility with Polaris Spark Client
-The Polaris Spark client today depends on a specific Iceberg client version, and the version dependency is described
-in the following table:
-
-| Spark Client Version | Iceberg Spark Client Version |
-|----------------------|------------------------------|
-| 1.0.0                | 1.9.0                        |
-
-The Iceberg dependency is automatically downloaded when the Polaris package is downloaded, so there is no need to
-add the Iceberg Spark client in the `packages` configuration.

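The "separate bundle project" follow-up mentioned in the commit message could look roughly like this (hypothetical: the module path, name, and wiring below are assumptions for illustration; nothing in this commit defines them):

```kotlin
// Hypothetical build.gradle.kts for a dedicated bundle module, e.g.
// plugins/spark/v3.5/bundle/. In a module whose only purpose is the bundle,
// the plugin-provided shadowJar task can safely be the published artifact,
// while the main spark module keeps publishing its plain jar.
plugins {
  id("polaris-client")
  id("com.gradleup.shadow")
}

dependencies {
  // the bundle module depends on the real spark client module (assumed path)
  implementation(project(":polaris-spark-3.5_2.12"))
}

tasks.named<com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar>("shadowJar") {
  archiveClassifier = "bundle"
  isZip64 = true
  // same relocations the current createPolarisSparkJar task applies
  relocate("com.fasterxml", "org.apache.polaris.shaded.com.fasterxml.jackson")
  relocate("org.apache.avro", "org.apache.polaris.shaded.org.apache.avro")
}
```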