This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 8fd58ed00505 [SPARK-54276][BUILD] Bump Hadoop 3.4.3
8fd58ed00505 is described below
commit 8fd58ed005053632199522f75cef7e7d5242640a
Author: Cheng Pan <[email protected]>
AuthorDate: Tue Feb 24 18:14:37 2026 -0800
[SPARK-54276][BUILD] Bump Hadoop 3.4.3
### What changes were proposed in this pull request?
Upgrade Hadoop dependency to 3.4.3.
### Why are the changes needed?
This release includes HADOOP-19212, which makes UGI work with Java 25.
https://hadoop.apache.org/release/3.4.3.html
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass CI. Also verified that `spark-sql` can successfully bootstrap on JDK 25 now:
```
$ java -version
openjdk version "25.0.1" 2025-10-21 LTS
OpenJDK Runtime Environment Temurin-25.0.1+8 (build 25.0.1+8-LTS)
OpenJDK 64-Bit Server VM Temurin-25.0.1+8 (build 25.0.1+8-LTS, mixed mode, sharing)
$ build/sbt -Phive,hive-thriftserver clean package
$ SPARK_PREPEND_CLASSES=true bin/spark-sql
NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes ahead of assembly.
WARNING: Using incubator modules: jdk.incubator.vector
WARNING: package sun.security.action not in java.base
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
26/01/28 17:23:22 WARN Utils: Your hostname, H27212-MAC-01.local, resolves to a loopback address: 127.0.0.1; using 10.242.159.140 instead (on interface en0)
26/01/28 17:23:22 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
26/01/28 17:23:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
WARNING: A terminally deprecated method in sun.misc.Unsafe has been called
WARNING: sun.misc.Unsafe::arrayBaseOffset has been called by org.apache.spark.unsafe.Platform (file:/Users/chengpan/Projects/apache-spark/common/unsafe/target/scala-2.13/classes/)
WARNING: Please consider reporting this to the maintainers of class org.apache.spark.unsafe.Platform
WARNING: sun.misc.Unsafe::arrayBaseOffset will be removed in a future release
26/01/28 17:23:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
26/01/28 17:23:27 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore chengpan@127.0.0.1
26/01/28 17:23:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
Spark Web UI available at http://10.242.159.140:4040
Spark master: local[*], Application Id: local-1769592205115
spark-sql (default)> select version();
4.2.0 14557582199659d838bbaa7d7b182e5d92c3b907
Time taken: 1.376 seconds, Fetched 1 row(s)
spark-sql (default)>
```
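As a quick sanity check on transcripts like the one above, the JDK feature release can be read off the `openjdk version` banner line. A minimal shell sketch (the version string is copied verbatim from the output above):

```shell
# Extract the JDK feature version from a `java -version` banner line.
line='openjdk version "25.0.1" 2025-10-21 LTS'
feature=$(printf '%s\n' "$line" | sed -E 's/[^"]*"([0-9]+)\..*/\1/')
echo "Java feature version: $feature"
```

This is only a convenience for eyeballing logs; the authoritative check remains running `java -version` against the JDK that launched `spark-sql`.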
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #54029 from pan3793/SPARK-54276.
Authored-by: Cheng Pan <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
dev/deps/spark-deps-hadoop-3-hive-2.3 | 20 ++++++++++----------
docs/building-spark.md | 2 +-
pom.xml | 2 +-
.../kubernetes/integration-tests/README.md | 4 ++--
.../spark/sql/hive/client/IsolatedClientLoader.scala | 2 +-
5 files changed, 15 insertions(+), 15 deletions(-)
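For readers unfamiliar with the `dev/deps` manifests touched below: judging by the entries in the file, each line encodes an artifact as `<name>/<version>/<classifier>/<jar-file>`, with an empty classifier field producing the double slash (`//`). A small sketch of splitting one entry, using a value from this patch (the field names are my labels, not part of the file):

```shell
# Split one dev/deps manifest entry into its fields.
# Format (inferred from the file): name/version/classifier/jar-file
entry='hadoop-client-api/3.4.3//hadoop-client-api-3.4.3.jar'
IFS='/' read -r name version classifier jar <<< "$entry"
echo "name=$name version=$version classifier=${classifier:-<none>} jar=$jar"
```
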
diff --git a/dev/deps/spark-deps-hadoop-3-hive-2.3 b/dev/deps/spark-deps-hadoop-3-hive-2.3
index 7f7039c7f0e6..fcb6983898fe 100644
--- a/dev/deps/spark-deps-hadoop-3-hive-2.3
+++ b/dev/deps/spark-deps-hadoop-3-hive-2.3
@@ -68,16 +68,16 @@ gcs-connector/hadoop3-2.2.31/shaded/gcs-connector-hadoop3-2.2.31-shaded.jar
gmetric4j/1.0.10//gmetric4j-1.0.10.jar
gson/2.13.2//gson-2.13.2.jar
guava/33.4.8-jre//guava-33.4.8-jre.jar
-hadoop-aliyun/3.4.2//hadoop-aliyun-3.4.2.jar
-hadoop-annotations/3.4.2//hadoop-annotations-3.4.2.jar
-hadoop-aws/3.4.2//hadoop-aws-3.4.2.jar
-hadoop-azure-datalake/3.4.2//hadoop-azure-datalake-3.4.2.jar
-hadoop-azure/3.4.2//hadoop-azure-3.4.2.jar
-hadoop-client-api/3.4.2//hadoop-client-api-3.4.2.jar
-hadoop-client-runtime/3.4.2//hadoop-client-runtime-3.4.2.jar
-hadoop-cloud-storage/3.4.2//hadoop-cloud-storage-3.4.2.jar
-hadoop-huaweicloud/3.4.2//hadoop-huaweicloud-3.4.2.jar
-hadoop-shaded-guava/1.4.0//hadoop-shaded-guava-1.4.0.jar
+hadoop-aliyun/3.4.3//hadoop-aliyun-3.4.3.jar
+hadoop-annotations/3.4.3//hadoop-annotations-3.4.3.jar
+hadoop-aws/3.4.3//hadoop-aws-3.4.3.jar
+hadoop-azure-datalake/3.4.3//hadoop-azure-datalake-3.4.3.jar
+hadoop-azure/3.4.3//hadoop-azure-3.4.3.jar
+hadoop-client-api/3.4.3//hadoop-client-api-3.4.3.jar
+hadoop-client-runtime/3.4.3//hadoop-client-runtime-3.4.3.jar
+hadoop-cloud-storage/3.4.3//hadoop-cloud-storage-3.4.3.jar
+hadoop-huaweicloud/3.4.3//hadoop-huaweicloud-3.4.3.jar
+hadoop-shaded-guava/1.5.0//hadoop-shaded-guava-1.5.0.jar
hive-beeline/2.3.10//hive-beeline-2.3.10.jar
hive-cli/2.3.10//hive-cli-2.3.10.jar
hive-common/2.3.10//hive-common-2.3.10.jar
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 57fa3c2bd6a4..1e7c28a273e8 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -83,7 +83,7 @@ You can enable the `yarn` profile and specify the exact version of Hadoop to compile
Example:
- ./build/mvn -Pyarn -Dhadoop.version=3.4.1 -DskipTests clean package
+ ./build/mvn -Pyarn -Dhadoop.version=3.4.3 -DskipTests clean package
## Building With Hive and JDBC Support
diff --git a/pom.xml b/pom.xml
index 436edd65b879..92221fee4850 100644
--- a/pom.xml
+++ b/pom.xml
@@ -127,7 +127,7 @@
<slf4j.version>2.0.17</slf4j.version>
<log4j.version>2.25.3</log4j.version>
<!-- make sure to update IsolatedClientLoader whenever this version is changed -->
- <hadoop.version>3.4.2</hadoop.version>
+ <hadoop.version>3.4.3</hadoop.version>
<!-- SPARK-41247: When updating `protobuf.version`, also need to update `protoVersion` in `SparkBuild.scala` -->
<protobuf.version>4.33.5</protobuf.version>
<protoc-jar-maven-plugin.version>3.11.4</protoc-jar-maven-plugin.version>
diff --git a/resource-managers/kubernetes/integration-tests/README.md b/resource-managers/kubernetes/integration-tests/README.md
index d21c619c31b0..8f50b9ca7354 100644
--- a/resource-managers/kubernetes/integration-tests/README.md
+++ b/resource-managers/kubernetes/integration-tests/README.md
@@ -136,8 +136,8 @@ properties to Maven. For example:
mvn integration-test -am -pl :spark-kubernetes-integration-tests_2.13 \
-Pkubernetes -Pkubernetes-integration-tests \
- -Phadoop-3 -Dhadoop.version=3.4.0 \
- -Dspark.kubernetes.test.sparkTgz=spark-4.1.0-SNAPSHOT-bin-example.tgz \
+ -Phadoop-3 -Dhadoop.version=3.4.3 \
+ -Dspark.kubernetes.test.sparkTgz=spark-4.2.0-SNAPSHOT-bin-example.tgz \
-Dspark.kubernetes.test.imageTag=sometag \
-Dspark.kubernetes.test.imageRepo=docker.io/somerepo \
-Dspark.kubernetes.test.namespace=spark-int-tests \
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
index c439dfbd9169..8460bdd43fb0 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/client/IsolatedClientLoader.scala
@@ -65,7 +65,7 @@ private[hive] object IsolatedClientLoader extends Logging {
case e: RuntimeException if e.getMessage.contains("hadoop") =>
// If the error message contains hadoop, it is probably because the hadoop
// version cannot be resolved.
- val fallbackVersion = "3.4.2"
+ val fallbackVersion = "3.4.3"
logWarning(log"Failed to resolve Hadoop artifacts for the version " +
log"${MDC(HADOOP_VERSION, hadoopVersion)}. We will change the hadoop version from " +
log"${MDC(HADOOP_VERSION, hadoopVersion)} to " +