If memory serves, in 1.3.1 at least there was a problem with the point at
which the driver jar was added -- the right classloader wasn't picking it
up. You can try searching the archives, but the issue looks similar to
these threads:
http://stackoverflow.com/questions/30940566/connecting-from-spark-pyspark-to-postgresql
http://stackoverflow.com/questions/30221677/spark-sql-postgresql-jdbc-classpath-issues
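
The workaround that came out of those threads (a sketch only -- I'm going
from memory, and the exact driver class names depend on the jars you ship)
is a shim that delegates to the real driver: java.sql.DriverManager ignores
drivers loaded by a classloader other than its own, but it will accept a
locally-loaded wrapper. In Scala, roughly:

    import java.sql.{Connection, Driver, DriverManager, DriverPropertyInfo}
    import java.util.Properties
    import java.util.logging.Logger

    // Delegates every Driver method to the wrapped instance; because the
    // shim itself is loaded by the application classloader, DriverManager
    // accepts the registration.
    class DriverShim(d: Driver) extends Driver {
      def connect(url: String, info: Properties): Connection = d.connect(url, info)
      def acceptsURL(url: String): Boolean = d.acceptsURL(url)
      def getPropertyInfo(url: String, info: Properties): Array[DriverPropertyInfo] =
        d.getPropertyInfo(url, info)
      def getMajorVersion(): Int = d.getMajorVersion
      def getMinorVersion(): Int = d.getMinorVersion
      def jdbcCompliant(): Boolean = d.jdbcCompliant()
      def getParentLogger(): Logger = d.getParentLogger
    }

    // Register each vendor's driver through its own shim:
    for (cls <- Seq("com.mysql.jdbc.Driver", "org.postgresql.Driver")) {
      val real = Class.forName(cls).newInstance().asInstanceOf[Driver]
      DriverManager.registerDriver(new DriverShim(real))
    }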

I thought this was fixed in 1.4.1... but in any case, if 1.4.1 is still a
no-go, maybe try setting SPARK_CLASSPATH explicitly. That might be a PITA
if you're doing reads, since you'd have to set it on each slave -- I'd run
with a single slave until you get this sorted...
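
For concreteness, something like this in conf/spark-env.sh on the driver
and on each slave (the paths are made up -- point them at wherever your
jars actually live):

    # SPARK_CLASSPATH is deprecated in these versions but, as far as I
    # recall, still honored at launch; both jars go on one classpath.
    export SPARK_CLASSPATH=/opt/jars/postgresql.jar:/opt/jars/mysql-connector-java.jar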

On Fri, Sep 4, 2015 at 11:59 PM, Nicholas Connor <
nicholas.k.con...@gmail.com> wrote:

> So, I need to connect to multiple databases to do cool stuff with Spark.
> To do this, I need multiple database drivers: Postgres + MySQL.
>
> *Problem*: Spark fails to load both drivers
>
> This method works for one driver at a time:
>
> spark-submit **** --driver-class-path="/driver.jar"
>
> These methods do not work, for one driver or for many (though Spark does
> say Added "driver.jar" with timestamp *** in the log):
>
>    - spark-submit --jars "driver1.jar, driver2.jar"
>    - sparkContext.addJar("driver.jar")
>    - echo 'spark.driver.extraClassPath="driver.jar"' >>
>    spark-defaults.conf
>    - echo 'spark.executor.extraClassPath="driver.jar"' >>
>    spark-defaults.conf
>    - sbt assembly (fat jar with drivers)
>
> *Example error:*
>
> Exception in thread "main" java.sql.SQLException: No suitable driver found
> for jdbc:mysql://****
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1055)
>     at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:956)
>     at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3491)
>     at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3423)
>     at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:910)
>     at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:3923)
>     at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1273)
>
> *Versions Tested*: Spark 1.3.1 and 1.4.1
>
> What method can I use to load both drivers?
> Thanks,
>
> Nicholas Connor
>
