This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 533d9c397f9e [SPARK-51146][INFRA] Publish a new Spark distribution with Spark Connect enabled (extra tarball)
533d9c397f9e is described below
commit 533d9c397f9ea7fb9fc08b4ae8980cb5cb15ad68
Author: Wenchen Fan <[email protected]>
AuthorDate: Tue Feb 11 19:52:01 2025 +0800
[SPARK-51146][INFRA] Publish a new Spark distribution with Spark Connect enabled (extra tarball)
### What changes were proposed in this pull request?
This is the second step toward publishing a new Spark distribution with Spark Connect enabled. A new tarball will be published with Spark Connect enabled and the other default settings (Hadoop 3 and Scala 2.13).
The new PyPI package will be added later.
### Why are the changes needed?
To publish a new Spark Connect-enabled distribution.
### Does this PR introduce _any_ user-facing change?
no
### How was this patch tested?
Manually tested in dry-run mode.
### Was this patch authored or co-authored using generative AI tooling?
no
Closes #49885 from cloud-fan/release.
Authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
---
.../main/scala/org/apache/spark/internal/config/package.scala | 3 ++-
dev/create-release/release-build.sh | 10 ++++++++++
dev/make-distribution.sh | 9 +++------
3 files changed, 15 insertions(+), 7 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/internal/config/package.scala b/core/src/main/scala/org/apache/spark/internal/config/package.scala
index fc3f22abd9d7..5bf7e5f9536c 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/package.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/package.scala
@@ -2831,5 +2831,6 @@ package object config {
.stringConf
.transform(_.toLowerCase(Locale.ROOT))
.checkValues(Set("connect", "classic"))
- .createWithDefault("classic")
+ .createWithDefault(
+ if (sys.env.get("SPARK_CONNECT_MODE").contains("1")) "connect" else "classic")
}
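The Scala hunk above makes the default API mode depend on an environment variable. A minimal shell sketch of the same resolution logic (the `resolve_api_mode` helper name is invented for illustration; Spark's real check lives in the Scala config above):

```shell
#!/bin/bash
# Sketch (not Spark's actual code) of the default-mode resolution the Scala
# hunk implements: "connect" only when SPARK_CONNECT_MODE is exactly "1",
# otherwise "classic". The helper name resolve_api_mode is invented.
resolve_api_mode() {
  if [ "${SPARK_CONNECT_MODE:-}" = "1" ]; then
    echo "connect"
  else
    echo "classic"
  fi
}

unset SPARK_CONNECT_MODE
resolve_api_mode            # prints: classic

export SPARK_CONNECT_MODE=1
resolve_api_mode            # prints: connect
```

This is why the make-distribution.sh change below only needs to export SPARK_CONNECT_MODE=1 in the launcher scripts of the extra tarball: the same jars serve both distributions, and only the default flips.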
diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
index a378f790572b..8b0106696ee9 100755
--- a/dev/create-release/release-build.sh
+++ b/dev/create-release/release-build.sh
@@ -338,6 +338,16 @@ if [[ "$1" == "package" ]]; then
--output spark-$SPARK_VERSION-bin-$NAME.tgz.asc \
--detach-sig spark-$SPARK_VERSION-bin-$NAME.tgz
shasum -a 512 spark-$SPARK_VERSION-bin-$NAME.tgz > spark-$SPARK_VERSION-bin-$NAME.tgz.sha512
+
+ if [[ -n $SPARK_CONNECT_FLAG ]]; then
+ echo "Copying and signing Spark Connect binary distribution"
+ SPARK_CONNECT_DIST_NAME=spark-$SPARK_VERSION-bin-$NAME-spark-connect.tgz
+ cp spark-$SPARK_VERSION-bin-$NAME/$SPARK_CONNECT_DIST_NAME .
+ echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --armour \
+ --output $SPARK_CONNECT_DIST_NAME.asc \
+ --detach-sig $SPARK_CONNECT_DIST_NAME
+ shasum -a 512 $SPARK_CONNECT_DIST_NAME > $SPARK_CONNECT_DIST_NAME.sha512
+ fi
}
# List of binary packages built. Populates two associative arrays, where the key is the "name" of
diff --git a/dev/make-distribution.sh b/dev/make-distribution.sh
index 46509dc530fc..d9f4148c24eb 100755
--- a/dev/make-distribution.sh
+++ b/dev/make-distribution.sh
@@ -317,12 +317,9 @@ if [ "$MAKE_TGZ" == "true" ]; then
TARDIR="$SPARK_HOME/$TARDIR_NAME"
rm -rf "$TARDIR"
cp -r "$DISTDIR" "$TARDIR"
- sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\
- &/' "$TARDIR/bin/pyspark"
- sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\
- &/' "$TARDIR/bin/spark-shell"
- sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\
- &/' "$TARDIR/bin/spark-submit"
+ sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\n&/' "$TARDIR/bin/pyspark"
+ sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\n&/' "$TARDIR/bin/spark-shell"
+ sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\n&/' "$TARDIR/bin/spark-submit"
$TAR -czf "$TARDIR_NAME.tgz" -C "$SPARK_HOME" "$TARDIR_NAME"
rm -rf "$TARDIR"
fi
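The sed edits above splice `export SPARK_CONNECT_MODE=1` in front of the last line of each launcher script. A small standalone sketch of that idiom on a dummy script (`some-launcher` is a placeholder; note that `\n` in the replacement is a GNU sed feature, while BSD sed needs the literal backslash-newline form the diff removes):

```shell
#!/bin/bash
# Sketch of the sed idiom from make-distribution.sh: insert an export line
# just before the last line of a script. `$` addresses the last line and
# `&` re-inserts the matched text after the new export. GNU sed only
# (`\n` in the replacement); "some-launcher" is a placeholder command.
set -euo pipefail
script=$(mktemp)
printf '#!/bin/bash\nexec some-launcher "$@"\n' > "$script"

sed -i -e '$s/.*/export SPARK_CONNECT_MODE=1\n&/' "$script"

cat "$script"
# prints:
# #!/bin/bash
# export SPARK_CONNECT_MODE=1
# exec some-launcher "$@"
```

Inserting before the final `exec` line keeps the export in effect for the launched JVM or Python process without touching the rest of the script.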
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]