Hi Gaurav,
You can try something like this.
// Spark 1.x Java API
import java.util.Properties;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

SparkConf conf = new SparkConf();
JavaSparkContext sc = new JavaSparkContext(conf);
SQLContext sqlContext = new SQLContext(sc);

// The MySQL JDBC driver must be on the classpath
Class.forName("com.mysql.jdbc.Driver");

String url = "url"; // your JDBC URL, e.g. "jdbc:mysql://host:3306/db"
Properties prop = new Properties();
prop.setProperty("user", "user");
prop.setProperty("password", "password");

// Each read().jdbc(...) call returns its own DataFrame,
// so one SQLContext can load as many tables as you need.
DataFrame tableA = sqlContext.read().jdbc(url, "tableA", prop);
DataFrame tableB = sqlContext.read().jdbc(url, "tableB", prop);
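If you then want to run queries across those tables, one approach is to register each DataFrame as a temporary table and use Spark SQL (a sketch using the Spark 1.x registerTempTable API; the join column "id" here is a made-up example):

```java
// Register each DataFrame under a name visible to Spark SQL
tableA.registerTempTable("tableA");
tableB.registerTempTable("tableB");

// Any number of registered tables can be queried and joined
// within the same SQLContext. "id" is a hypothetical join column.
DataFrame joined = sqlContext.sql(
    "SELECT a.*, b.* FROM tableA a JOIN tableB b ON a.id = b.id");
```

Also note that the dbtable option/argument accepts a parenthesized subquery with an alias, e.g. "(select * from tableA where someCol > 10) t", so you are not limited to plain table names.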
Hope this helps.
Thanks,
Prashant
From: Gaurav Agarwal [mailto:gaurav130...@gmail.com]
Sent: Thursday, February 11, 2016 7:35 PM
To: user@spark.apache.org
Subject: Dataframes
Hi
Can we load 5 DataFrames for 5 tables in one SparkContext?
I am asking because we have to give:
Map<String, String> options = new HashMap<>();
options.put("driver", "");
options.put("url", "");
options.put("dbtable", "");
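(For reference, the full options-map form of the JDBC read in Spark 1.x looks like the sketch below; the driver class is MySQL's and the URL is a placeholder. Only "dbtable" changes per table, so one SQLContext can load several tables this way.)

```java
// Options-map form of a JDBC read (Spark 1.x)
Map<String, String> options = new HashMap<>();
options.put("driver", "com.mysql.jdbc.Driver");
options.put("url", "jdbc:mysql://host:3306/db"); // placeholder URL
options.put("dbtable", "tableA"); // vary this per table
DataFrame tableA = sqlContext.read().format("jdbc").options(options).load();
```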
I can give only one table query at a time in the dbtable option.
How will I register multiple queries and DataFrames for all the tables?
Thanks