I am using a Spark 1.3 standalone cluster on my local Windows machine and am
trying to load data from one of our servers. Below is my code -

import os
# Use a raw string so the backslashes in the Windows path are not
# interpreted as escape sequences.
os.environ['SPARK_CLASSPATH'] = r"C:\Users\ACERNEW3\Desktop\Spark\spark-1.3.0-bin-hadoop2.4\postgresql-9.2-1002.jdbc3.jar"

from pyspark import SparkContext, SparkConf, SQLContext

sc = SparkContext(appName="SampleApp")
sqlContext = SQLContext(sc)
df = sqlContext.load(source="jdbc",
                     url="jdbc:postgresql://54.189.136.218/reporting",
                     dbtable="public.course_mast")


When I run it, it throws the error "No suitable driver found for
jdbc:postgresql". Please help me out.
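[Editorial note for later readers, not part of the original question: this error typically means the PostgreSQL JDBC jar is not visible to the driver JVM's classloader when `java.sql.DriverManager` resolves the `jdbc:postgresql` URL. Setting `SPARK_CLASSPATH` from inside the script can be too late, since the JVM may already be running. A commonly reported workaround is to pass the jar on the PySpark launch command instead. A sketch, assuming the same jar path as in the post (Windows `cmd` syntax):]

```shell
rem Hypothetical launch command; the jar path is the one from the post.
rem --driver-class-path puts the jar on the driver JVM's classpath
rem (where DriverManager looks); --jars also ships it to the executors.
bin\pyspark ^
  --driver-class-path C:\Users\ACERNEW3\Desktop\Spark\spark-1.3.0-bin-hadoop2.4\postgresql-9.2-1002.jdbc3.jar ^
  --jars C:\Users\ACERNEW3\Desktop\Spark\spark-1.3.0-bin-hadoop2.4\postgresql-9.2-1002.jdbc3.jar
```

[Some users also report needing to name the driver class explicitly by adding `driver="org.postgresql.Driver"` to the `sqlContext.load(...)` options; whether this is required appears to depend on the Spark 1.3.x point release.]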



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/NEWBIE-not-able-to-connect-to-postgresql-using-jdbc-tp22569.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
