Yes, you would need to add the MySQL driver JAR to both the Spark driver and
executor classpaths.
One option is the deprecated SPARK_CLASSPATH environment variable (which the
latest docs still recommend, even though it is deprecated):
>export SPARK_CLASSPATH=/usr/share/java/mysql-connector.jar
>spark-shell   (or spark-sql)

The other, non-deprecated way is to set the following properties in
spark-defaults.conf:
spark.driver.extraClassPath    /usr/share/java/mysql-connector.jar
spark.executor.extraClassPath  /usr/share/java/mysql-connector.jar

and then run spark-shell or spark-sql, or start the Thrift server
(start-thriftserver.sh) and connect with beeline.
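
As a sketch of what this enables: once the connector is on both classpaths, a
MySQL table can be exposed through Spark SQL's JDBC data source. The host,
database, table name, and credentials below are made-up placeholders:

```sql
-- Register a MySQL table in spark-sql via the JDBC data source (Spark 1.3+).
-- URL, dbtable, and credentials are illustrative assumptions.
CREATE TEMPORARY TABLE people
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:mysql://localhost:3306/test?user=spark&password=secret",
  dbtable "people"
);

SELECT * FROM people LIMIT 10;
```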

Things will get better once SPARK-6966 is merged into 1.4.0, when you can
either:
1. pass the --jars parameter to spark-shell, spark-sql, etc., or
2. call sc.addJar to add the driver after starting spark-shell.
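
For reference, that workflow would look roughly like this (assuming the fix
lands as described; the jar path is illustrative):

```
# Option 1: ship the driver jar at launch time
spark-shell --jars /usr/share/java/mysql-connector.jar

# Option 2: add it after the shell is already running
scala> sc.addJar("/usr/share/java/mysql-connector.jar")
```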

Good Luck,
Anand Mohan



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark1-3-1-using-mysql-error-tp22643p22658.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
