Graeme Edwards created SPARK-17249:
--------------------------------------

             Summary: java.lang.IllegalStateException: Did not find registered 
driver with class org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper 
                 Key: SPARK-17249
                 URL: https://issues.apache.org/jira/browse/SPARK-17249
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.6.0
            Reporter: Graeme Edwards
            Priority: Minor


This issue is a corner case related to SPARK-14162 that isn't fixed by that change.

It occurs when all of the following hold (a hypothetical reproduction sketch follows the list):
- we are using Oracle's ojdbc driver,
- Spark wraps ojdbc in a DriverWrapper because it was added via the Spark class loader, and
- we don't specify an explicit "driver" property.
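
For concreteness, a minimal reproduction sketch under those conditions. The connection string and table name here are made up for illustration; ojdbc is assumed to be on the classpath (e.g. added with --jars):

    // ojdbc is registered through Spark's DriverWrapper because it was
    // loaded via the Spark class loader; no "driver" property is given,
    // so Spark must resolve the driver class name itself.
    val props = new java.util.Properties()
    val df = sqlContext.read.jdbc(
      "jdbc:oracle:thin:@//dbhost:1521/service", "SOME_TABLE", props)
    // Executors then fail with:
    //   java.lang.IllegalStateException: Did not find registered driver with class
    //   org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper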

Then, in org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala (createConnectionFactory), driverClass is resolved as:

    val driverClass: String = userSpecifiedDriverClass.getOrElse {
      DriverManager.getDriver(url).getClass.getCanonicalName
    }

Since the driver is wrapped by a DriverWrapper, this resolves to "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper".

That name is passed to the executor, which attempts to find a registered driver named "org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper". However, the executor is aware of the wrapping and compares against the wrapped driver's class name instead:

    case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d

I think the fix is to make the initialization of driverClass aware that the driver might be wrapped and, if so, pass the wrapped driver's class name instead.
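
A minimal sketch of that idea, assuming createConnectionFactory can simply unwrap any DriverWrapper that DriverManager hands back (the exact shape of the fix is hypothetical):

    val driverClass: String = userSpecifiedDriverClass.getOrElse {
      DriverManager.getDriver(url) match {
        // Unwrap so the executor-side lookup sees the real driver class
        // (e.g. oracle.jdbc.OracleDriver) rather than the wrapper's name.
        case wrapper: DriverWrapper => wrapper.wrapped.getClass.getCanonicalName
        case driver => driver.getClass.getCanonicalName
      }
    }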

The problem can be worked around by setting the "driver" property on the JDBC call:

    // Explicitly naming the real driver class bypasses the broken
    // DriverWrapper name resolution on the driver side.
    val props = new java.util.Properties()
    props.put("driver", "oracle.jdbc.OracleDriver")
    val result = sqlContext.read.jdbc(connectionString, query, props)
