This is an automated email from the ASF dual-hosted git repository.

yangjie01 pushed a commit to branch branch-4.1
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-4.1 by this push:
     new b21fb415b947 [SPARK-54565][CORE] SparkBuildInfo should load `spark-version-info.properties` from its own classloader
b21fb415b947 is described below

commit b21fb415b9472331e47e0cced9728b7bba2b29bd
Author: Cheng Pan <[email protected]>
AuthorDate: Wed Dec 3 19:32:17 2025 +0800

    [SPARK-54565][CORE] SparkBuildInfo should load `spark-version-info.properties` from its own classloader
    
    ### What changes were proposed in this pull request?
    
    Change SparkBuildInfo to use its own classloader instead of the thread context classloader to load `spark-version-info.properties`.
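
    For illustration, a minimal sketch of the before/after lookup (`VersionInfoLoader` is a hypothetical stand-in for `SparkBuildInfo`; the actual change is in the diff below):

    ```scala
    object VersionInfoLoader {
      def load(): java.io.InputStream = {
        // Before: the lookup went through the thread context classloader. An
        // embedding application (here, DataGrip's RMI worker thread) may set
        // that to a loader that cannot see Spark's resources, so the lookup
        // returns null.
        // Thread.currentThread().getContextClassLoader
        //   .getResourceAsStream("spark-version-info.properties")

        // After: the lookup goes through the classloader that defined this
        // class, which by construction can see resources shipped in the
        // same jar.
        getClass.getClassLoader.getResourceAsStream("spark-version-info.properties")
      }
    }
    ```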
    
    ### Why are the changes needed?
    
    I hit an issue while integrating the Connect JDBC driver with JetBrains DataGrip.
    ```
    2025-11-25 18:48:09,475 [  55114]   WARN - #c.i.d.d.BaseDatabaseErrorHandler$MissingDriverClassErrorInfo - Exception org.apache.spark.SparkException: Could not find spark-version-info.properties [in thread "RMI TCP Connection(3)-127.0.0.1"]
    java.lang.ExceptionInInitializerError: Exception org.apache.spark.SparkException: Could not find spark-version-info.properties [in thread "RMI TCP Connection(3)-127.0.0.1"]
            at org.apache.spark.SparkBuildInfo$.<clinit>(SparkBuildInfo.scala:35)
            at org.apache.spark.sql.connect.client.SparkConnectClient$.org$apache$spark$sql$connect$client$SparkConnectClient$$genUserAgent(SparkConnectClient.scala:978)
            at org.apache.spark.sql.connect.client.SparkConnectClient$Configuration$.apply$default$8(SparkConnectClient.scala:999)
            at org.apache.spark.sql.connect.client.SparkConnectClient$Builder.<init>(SparkConnectClient.scala:683)
            at org.apache.spark.sql.connect.client.SparkConnectClient$.builder(SparkConnectClient.scala:676)
            at org.apache.spark.sql.connect.client.jdbc.SparkConnectConnection.<init>(SparkConnectConnection.scala:31)
            at org.apache.spark.sql.connect.client.jdbc.NonRegisteringSparkConnectDriver.connect(NonRegisteringSparkConnectDriver.scala:36)
            at com.intellij.database.remote.jdbc.helpers.JdbcHelperImpl.connect(JdbcHelperImpl.java:786)
            at com.intellij.database.remote.jdbc.impl.RemoteDriverImpl.connect(RemoteDriverImpl.java:47)
    ```
    
    After adding some debug messages, I found it was caused by using the wrong classloader.
    
    ```
    c.i.e.r.RemoteProcessSupport - ContextClassLoader: com.intellij.database.remote.jdbc.impl.JdbcClassLoader$1559cc356
    c.i.e.r.RemoteProcessSupport - SparkBuildInfo ClassLoader: com.intellij.database.remote.jdbc.impl.JdbcClassLoader$JdbcClassLoaderImpl62e93ea8
    ```
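
    A probe of roughly this shape surfaces the mismatch (a hypothetical reconstruction; the actual debug statements were not committed):

    ```scala
    object ClassLoaderProbe {
      def main(args: Array[String]): Unit = {
        // The loader the old code consulted for the resource lookup.
        println(s"ContextClassLoader: ${Thread.currentThread().getContextClassLoader}")
        // The loader that defines SparkBuildInfo, obtained without triggering
        // its initializer (which is what throws when the properties file is
        // missing). Scala objects compile to a class named with a trailing $.
        val cls = Class.forName("org.apache.spark.SparkBuildInfo$", false, getClass.getClassLoader)
        println(s"SparkBuildInfo ClassLoader: ${cls.getClassLoader}")
      }
    }
    ```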
    
    A similar issue that affects the Hive JDBC driver and Spark's isolated classloader (see SPARK-32256) was fixed by [HADOOP-14067](https://issues.apache.org/jira/browse/HADOOP-14067).
    
    ### Does this PR introduce _any_ user-facing change?
    
    This fixes corner cases where an application uses multiple classloaders with the Spark libraries.
    
    ### How was this patch tested?
    
    Passed GHA to ensure the change breaks nothing; also manually verified the Connect JDBC driver & JetBrains DataGrip integration.
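
    For reference, a minimal way to reproduce the original failure mode outside DataGrip (hypothetical test code, not part of the patch; it assumes the `SparkBuildInfo.spark_version` field and must live in the `org.apache.spark` package because the object is package-private):

    ```scala
    package org.apache.spark

    import java.net.{URL, URLClassLoader}

    object ContextClassLoaderRepro {
      def main(args: Array[String]): Unit = {
        // Give the current thread an isolated context classloader that cannot
        // resolve spark-version-info.properties (null parent = bootstrap only).
        Thread.currentThread().setContextClassLoader(new URLClassLoader(Array.empty[URL], null))
        // Before this patch, touching SparkBuildInfo here failed with
        // "Could not find spark-version-info.properties"; with the patch, the
        // lookup uses SparkBuildInfo's own classloader and succeeds.
        println(SparkBuildInfo.spark_version)
      }
    }
    ```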
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #53279 from pan3793/SPARK-54565.
    
    Lead-authored-by: Cheng Pan <[email protected]>
    Co-authored-by: Cheng Pan <[email protected]>
    Signed-off-by: yangjie01 <[email protected]>
    (cherry picked from commit 095c2c3a081fff1dc31e2c5832f6d8907cfc0fab)
    Signed-off-by: yangjie01 <[email protected]>
---
 common/utils/src/main/scala/org/apache/spark/SparkBuildInfo.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/common/utils/src/main/scala/org/apache/spark/SparkBuildInfo.scala b/common/utils/src/main/scala/org/apache/spark/SparkBuildInfo.scala
index ebc62460d231..7618105bd72e 100644
--- a/common/utils/src/main/scala/org/apache/spark/SparkBuildInfo.scala
+++ b/common/utils/src/main/scala/org/apache/spark/SparkBuildInfo.scala
@@ -29,8 +29,8 @@ private[spark] object SparkBuildInfo {
     spark_build_date: String,
     spark_doc_root: String) = {
 
-    val resourceStream = Thread.currentThread().getContextClassLoader.
-      getResourceAsStream("spark-version-info.properties")
+    val resourceStream = getClass.getClassLoader
+      .getResourceAsStream("spark-version-info.properties")
     if (resourceStream == null) {
       throw new SparkException("Could not find spark-version-info.properties")
     }


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
