This looks like a classpath problem. If you can provide the command you used to run 
your application and the value of your SPARK_HOME environment variable, it will help 
others identify the root cause.



On 20-11-2015 18:59, Satish wrote:
Hi Michael,
As my current Spark version is 1.4.0, why does it error out with "error: not 
found: value sqlContext" when I have "import sqlContext.implicits._" in my 
Spark job?

Regards
Satish Chandra
From: Michael Armbrust
Sent: 20-11-2015 01:36
To: satish chandra j
Cc: user; hari krishna
Subject: Re: Error not found value sqlContext


http://spark.apache.org/docs/latest/sql-programming-guide.html#upgrading-from-spark-sql-10-12-to-13



On Thu, Nov 19, 2015 at 4:19 AM, satish chandra j <jsatishchan...@gmail.com> 
wrote:

Hi All,
We recently migrated from Spark 1.2.1 to Spark 1.4.0. I am fetching data 
from an RDBMS using JDBCRDD and registering it as a temp table to perform SQL queries.


The approach below worked fine in Spark 1.2.1:


JDBCRDD --> apply map using Case Class --> apply createSchemaRDD --> 
registerTempTable --> perform SQL Query
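
For context, that pipeline in code looked roughly like the sketch below (the 
connection string, query, and Person case class are illustrative, not from the 
actual job):

import java.sql.DriverManager
import org.apache.spark.SparkContext
import org.apache.spark.rdd.JdbcRDD

case class Person(id: Int, name: String)

val sc = new SparkContext("local[*]", "JdbcExample")
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD[Product] -> SchemaRDD (Spark 1.0-1.2 only)

val personRDD = new JdbcRDD(sc,
  () => DriverManager.getConnection("jdbc:postgresql://host/db", "user", "pass"),
  "SELECT id, name FROM person WHERE ? <= id AND id <= ?",
  1, 100, 2,  // lower bound, upper bound, number of partitions
  rs => Person(rs.getInt(1), rs.getString(2)))

personRDD.registerTempTable("person")  // compiles via the createSchemaRDD implicit
sqlContext.sql("SELECT name FROM person WHERE id < 10").collect()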


But since createSchemaRDD is no longer supported in Spark 1.4.0, I tried the two 
approaches below:



Approach 1: JDBCRDD --> apply map using Case Class with .toDF() --> registerTempTable --> 
perform SQL query on the temp table




Approach 2: JDBCRDD --> apply map using Case Class --> RDD.toDF().registerTempTable --> 
perform SQL query on the temp table
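
Continuing the sketch above, the two approaches come down to this (in Spark 1.3+ 
the implicits import replaces createSchemaRDD):

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._  // enables .toDF() on RDDs of case classes

// Approach 1: call .toDF() on the mapped RDD, then register the DataFrame
val df = personRDD.toDF()  // personRDD: RDD[Person], as in the sketch above
df.registerTempTable("person")

// Approach 2: the same conversion chained in one expression
personRDD.toDF().registerTempTable("person")

sqlContext.sql("SELECT name FROM person WHERE id < 10").collect()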



The only solution I find everywhere is to use "import sqlContext.implicits._" after
val SQLContext = new org.apache.spark.sql.SQLContext(sc)
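
My understanding of that pattern as a complete job is sketched below with 
illustrative names. Note that the import can only resolve if the val is spelled 
sqlContext in lowercase and declared first, and the case class has to live outside 
the method so toDF() can find its TypeTag:

import org.apache.spark.{SparkConf, SparkContext}

// Case class at top level, not inside main(), so the toDF() implicit finds a TypeTag
case class Person(id: Int, name: String)

object JdbcJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("JdbcJob"))
    // The import below only compiles against a value named exactly sqlContext
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._

    val personRDD = sc.parallelize(Seq(Person(1, "a"), Person(2, "b")))
    personRDD.toDF().registerTempTable("person")
    sqlContext.sql("SELECT name FROM person").collect().foreach(println)
  }
}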


But it fails with these two errors:


1. error: not found: value sqlContext


2. value toDF is not a member of org.apache.spark.rdd.RDD
