Sorry, just to clarify my issue: if Eclipse could not understand Scala
syntax properly, then it should also error out for the other Spark job,
which fetches data from an RDBMS and prints the output to the console. I
think some dependencies are missing, due to which "import
sqlContext.implicits._" is not recognized at compile time
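
As a sanity check on the missing-dependency theory, a minimal sbt build for
a Spark 1.4.0 application needs spark-sql on the compile classpath, since
SQLContext and its implicits live in that module (a sketch; the Scala
version and "provided" scope are assumptions):

    // build.sbt -- a minimal sketch, versions are assumptions
    name := "spark-app"
    scalaVersion := "2.10.4"
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.4.0" % "provided"
    )

If spark-sql is not on the compile classpath, org.apache.spark.sql.SQLContext
itself will not resolve in Eclipse.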

Please let me know if any further inputs are needed to fix this

Regards,
Satish Chandra

On Mon, Nov 23, 2015 at 3:29 PM, prosp4300 <prosp4...@163.com> wrote:

>
>
> So it is actually a compile-time error in Eclipse. Instead of generating
> the jar from Eclipse, you can try using sbt to assemble your jar; it looks
> like your Eclipse does not recognize the Scala syntax properly.
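>
> A sketch of the sbt-assembly route (the plugin version is an assumption;
> any release compatible with your sbt should do):
>
>     // project/plugins.sbt
>     addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
>
>     // then, from the project root:
>     //   sbt assembly
>
> This takes Eclipse out of the loop and lets scalac itself confirm whether
> the code compiles.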
>
>
>
> At 2015-11-20 21:36:55, "satish chandra j" <jsatishchan...@gmail.com>
> wrote:
>
> Hi All,
> I am getting this error while generating the executable jar file itself in
> Eclipse, if the Spark application code has the "import
> sqlContext.implicits._" line in it. The Spark application code works fine
> if the above-mentioned line does not exist, as I have tested by fetching
> data from an RDBMS by implementing JDBCRDD
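>
> For reference, the working JDBCRDD path looks roughly like this (driver,
> URL, table, and column names are hypothetical):
>
>     import java.sql.{DriverManager, ResultSet}
>     import org.apache.spark.rdd.JdbcRDD
>
>     // the SQL must contain exactly two '?' placeholders for the bounds
>     val rdd = new JdbcRDD(
>       sc,
>       () => DriverManager.getConnection("jdbc:postgresql://host/db", "user", "pass"),
>       "SELECT id, name FROM people WHERE id >= ? AND id <= ?",
>       1L, 1000L, 2,
>       (rs: ResultSet) => (rs.getInt(1), rs.getString(2)))
>     rdd.collect().foreach(println)
>
> Note that no sqlContext or implicits are involved here, which matches this
> job compiling cleanly.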
>
> I tried a couple of DataFrame-related methods, and most of them error out
> stating that the method has been overloaded
>
> Please let me know if any further inputs are needed to analyze this
>
> Regards,
> Satish Chandra
>
> On Fri, Nov 20, 2015 at 5:46 PM, prosp4300 <prosp4...@163.com> wrote:
>
>>
>> Looks like a classpath problem. If you can provide the command you used
>> to run your application and the value of the environment variable
>> SPARK_HOME, it will help others identify the root problem
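>>
>> For example, something like (paths and class name are hypothetical):
>>
>>     echo $SPARK_HOME
>>     $SPARK_HOME/bin/spark-submit --class com.example.App --master local[*] \
>>       target/scala-2.10/app-assembly-1.0.jar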
>>
>>
>> On 2015-11-20 18:59, Satish <jsatishchan...@gmail.com> wrote:
>>
>> Hi Michael,
>> As my current Spark version is 1.4.0, why does it error out with "error:
>> not found: value sqlContext" when I have "import sqlContext.implicits._"
>> in my Spark job?
>>
>> Regards
>> Satish Chandra
>> ------------------------------
>> From: Michael Armbrust <mich...@databricks.com>
>> Sent: 20-11-2015 01:36
>> To: satish chandra j <jsatishchan...@gmail.com>
>> Cc: user <user@spark.apache.org>; hari krishna <harikrishn...@gmail.com>
>> Subject: Re: Error not found value sqlContext
>>
>>
>> http://spark.apache.org/docs/latest/sql-programming-guide.html#upgrading-from-spark-sql-10-12-to-13
>>
>> On Thu, Nov 19, 2015 at 4:19 AM, satish chandra j <
>> jsatishchan...@gmail.com> wrote:
>>
>>> Hi All,
>>> We have recently migrated from Spark 1.2.1 to Spark 1.4.0. I am fetching
>>> data from an RDBMS using JDBCRDD and registering it as a temp table to
>>> perform SQL queries
>>>
>>> The approach below works fine in Spark 1.2.1:
>>>
>>> JDBCRDD --> apply map using Case Class --> apply createSchemaRDD -->
>>> registerTempTable --> perform SQL Query
>>>
>>> but as createSchemaRDD is no longer supported in Spark 1.4.0, I tried:
>>>
>>> JDBCRDD --> apply map using Case Class with .toDF() -->
>>> registerTempTable --> perform SQL query on temp table
>>>
>>> JDBCRDD --> apply map using Case Class --> RDD.toDF().registerTempTable
>>> --> perform SQL query on temp table
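>>>
>>> For comparison, a rough sketch of the two pipelines (the case class and
>>> names are hypothetical):
>>>
>>>     // Spark 1.2.x: createSchemaRDD supplied the implicit conversion
>>>     val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>>>     import sqlContext.createSchemaRDD
>>>     val people = jdbcRdd.map(r => Person(r.getString(1), r.getInt(2)))
>>>     people.registerTempTable("people")
>>>
>>>     // Spark 1.4.x: toDF() comes from sqlContext.implicits._ instead
>>>     val sqlContext = new org.apache.spark.sql.SQLContext(sc)
>>>     import sqlContext.implicits._
>>>     val people = jdbcRdd.map(r => Person(r.getString(1), r.getInt(2))).toDF()
>>>     people.registerTempTable("people")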
>>>
>>> The only solution I find everywhere is to use "import
>>> sqlContext.implicits._" after val SQLContext = new
>>> org.apache.spark.sql.SQLContext(sc)
>>>
>>> But it errors out with two generic errors:
>>>
>>> *1. error: not found: value sqlContext*
>>>
>>> *2. value toDF is not a member of org.apache.spark.rdd.RDD*
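>>>
>>> One thing worth double-checking: a Scala import needs a stable identifier
>>> with exactly that spelling, so if the val is really declared as
>>> "SQLContext" (capital S) while the import says "sqlContext", error 1 is
>>> precisely what the compiler reports; error 2 then follows because toDF()
>>> only exists once the implicits are in scope. A minimal working shape for
>>> Spark 1.4.x would be (object and case class names are hypothetical):
>>>
>>>     import org.apache.spark.{SparkConf, SparkContext}
>>>
>>>     case class Person(name: String, age: Int)  // top level, not inside main
>>>
>>>     object App {
>>>       def main(args: Array[String]): Unit = {
>>>         val sc = new SparkContext(new SparkConf().setAppName("app"))
>>>         val sqlContext = new org.apache.spark.sql.SQLContext(sc)  // lower case, matches import
>>>         import sqlContext.implicits._  // brings toDF() into scope
>>>
>>>         val df = sc.parallelize(Seq(Person("a", 1))).toDF()
>>>         df.registerTempTable("people")
>>>         sqlContext.sql("SELECT name FROM people").show()
>>>       }
>>>     }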
>>>
