I recently had this same issue. Though I didn't find the cause, I was able
to work around it by loading the JAR into HDFS. Once it was there, I used
the --jars flag with the full HDFS path: --jars hdfs://{our
namenode}/tmp/postgresql-9.4-1204-jdbc42.jar
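
In case it helps, the commands looked roughly like this (the application
class and JAR names here are placeholders):

  hdfs dfs -put postgresql-9.4-1204-jdbc42.jar /tmp/
  spark-submit \
    --jars hdfs://{our namenode}/tmp/postgresql-9.4-1204-jdbc42.jar \
    --class com.example.MyJob \
    my-job.jar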

James

On Fri, Nov 13, 2015 at 10:14 AM satish chandra j <jsatishchan...@gmail.com>
wrote:

> Hi All,
> I am currently using Spark 1.4.1, and my Spark job has to fetch data from
> a PostgreSQL database using JdbcRDD.
> I am submitting my Spark job with --jars to pass the PostgreSQL JDBC
> driver, but I am still getting the error below:
>
> "java.sql.SQLException: No suitable driver found for PostgreSQL JDBC"
>
> When the same is done through the Spark shell, it works fine.
>
> Several blogs mention that this was fixed in Spark 1.4.1 by simply
> passing the JDBC driver through the --jars option, but I am still stuck.
>
> I have tried the options below:
>
>    1. SPARK_CLASSPATH=/path/postgresql.jar in
>       spark/conf/spark-defaults.conf
>
>    2. --driver-class-path /path/postgresql.jar and --conf
>       spark.executor.extraClassPath=/path/postgresql.jar
>
>    3. --jars /path/postgresql.jar
>
>    4. Currently trying to add SPARK_CLASSPATH in the file
>       "compute-classpath.sh" on each node of the cluster
>
> Please let me know if you have any inputs on how to proceed further.
>
> Regards
> Satish Chandra
>
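
One more thought: with JdbcRDD specifically, "No suitable driver" can also
show up when the driver class is never registered on the executors, even
though the JAR is on the classpath. Calling Class.forName inside the
connection factory works around that, since the factory runs on each
executor. A minimal sketch (the connection details, table, and bounds here
are made up):

  import java.sql.{DriverManager, ResultSet}
  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.rdd.JdbcRDD

  object PgFetch {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("pg-fetch"))

      val rows = new JdbcRDD(
        sc,
        () => {
          // Registering the driver here runs on each executor, so
          // DriverManager can resolve it in that JVM's classloader.
          Class.forName("org.postgresql.Driver")
          DriverManager.getConnection(
            "jdbc:postgresql://dbhost:5432/mydb", "user", "secret")
        },
        // JdbcRDD requires exactly two '?' placeholders for the bounds.
        "SELECT id, name FROM mytable WHERE ? <= id AND id <= ?",
        1L, 1000L, 4, // lowerBound, upperBound, numPartitions
        (rs: ResultSet) => (rs.getInt(1), rs.getString(2)))

      rows.collect().foreach(println)
      sc.stop()
    }
  }

If that still fails, it usually means the JAR really isn't reaching the
executors, which is where the HDFS path above helped us.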
