Repository: spark
Updated Branches:
  refs/heads/branch-1.6 714f4d78a -> 0a13e4c07


[SPARK-14204][SQL] register driverClass rather than user-specified class

This pull request fixes an issue in which cluster-mode executors fail to 
register a JDBC driver when the driver jar is provided by the user but the 
driver class name is derived from the JDBC URL (rather than specified by the 
user).  As a consequence, all JDBC accesses under these circumstances fail 
with an `IllegalStateException`.  I reported the issue here: 
https://issues.apache.org/jira/browse/SPARK-14204
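
For reference, a minimal way to hit the bug (a sketch only; the URL, table 
name, and credentials below are hypothetical) is a JDBC read where the driver 
jar is supplied via --jars but no "driver" option is set, so the class name 
must be resolved from the URL on the executors:

// Sketch: assumes a PostgreSQL driver jar passed with --jars and no "driver" option.
import java.util.Properties

val props = new Properties()
props.setProperty("user", "test")
props.setProperty("password", "secret")

// Before this patch, executor-side connection creation fails with
// IllegalStateException, because the URL-derived driver class is never
// registered on the executors.
val df = sqlContext.read.jdbc("jdbc:postgresql://db-host/testdb", "my_table", props)
df.count()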

My proposed solution is to have the executors register the JDBC driver class 
under all circumstances, not only when the driver is specified by the user.
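
For context, here is a condensed sketch of the connection-factory logic after 
the change.  The method signature is simplified for illustration (the real 
code reads the optional driver class from the connection properties); 
DriverRegistry and DriverWrapper are the existing helpers in the same jdbc 
datasources package.

import java.sql.{Connection, Driver, DriverManager}
import java.util.Properties
import scala.collection.JavaConverters._

// Simplified sketch; the surrounding code in JdbcUtils is condensed here.
def connectionFactory(
    url: String,
    properties: Properties,
    userSpecifiedDriverClass: Option[String]): () => Connection = {
  // Resolve the driver class on the driver node: either the user-supplied
  // class name, or the one derived from the JDBC URL.
  val driverClass: String = userSpecifiedDriverClass.getOrElse {
    DriverManager.getDriver(url).getClass.getCanonicalName
  }
  () => {
    // Register the resolved class unconditionally, so the closure also works on
    // executors where the class was derived from the URL rather than specified.
    DriverRegistry.register(driverClass)
    val driver: Driver = DriverManager.getDrivers.asScala.collectFirst {
      case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
      case d if d.getClass.getCanonicalName == driverClass => d
    }.getOrElse {
      throw new IllegalStateException(
        s"Did not find registered driver with class $driverClass")
    }
    driver.connect(url, properties)
  }
}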

This patch was tested manually.  I built an assembly jar, deployed it to a 
cluster, and confirmed that the problem was fixed.

Author: Kevin McHale <ke...@premise.com>

Closes #12000 from mchalek/jdbc-driver-registration.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0a13e4c0
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0a13e4c0
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0a13e4c0

Branch: refs/heads/branch-1.6
Commit: 0a13e4c0712fb83525eb5acbf55aabc4c9891ff7
Parents: 714f4d7
Author: Kevin McHale <ke...@premise.com>
Authored: Thu Jun 2 11:17:33 2016 -0500
Committer: Sean Owen <so...@cloudera.com>
Committed: Thu Jun 2 11:17:33 2016 -0500

----------------------------------------------------------------------
 .../apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala    | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/0a13e4c0/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
index 10f6506..c0aede0 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala
@@ -50,7 +50,7 @@ object JdbcUtils extends Logging {
       DriverManager.getDriver(url).getClass.getCanonicalName
     }
     () => {
-      userSpecifiedDriverClass.foreach(DriverRegistry.register)
+      DriverRegistry.register(driverClass)
       val driver: Driver = DriverManager.getDrivers.asScala.collectFirst {
        case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
         case d if d.getClass.getCanonicalName == driverClass => d

