Have you created sqlContext based on HiveContext?

  val sc = new SparkContext(conf)
  // Create sqlContext based on HiveContext
  val sqlContext = new HiveContext(sc)
  import sqlContext.implicits._

df.registerTempTable("person")
...............
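For completeness, a minimal end-to-end sketch of the same idea. The object name `PersonQuery` and the sample rows are hypothetical stand-ins for your `convertRDDToDF(...)` call; the point is that `registerTempTable` and `sql` must run against the *same* SQLContext/HiveContext instance, otherwise the temp table is not visible and you get "Table not found".

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object PersonQuery {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PersonQuery")
    val sc   = new SparkContext(conf)

    // One HiveContext used for both registration and the query.
    // A temp table registered on one SQLContext is invisible to another.
    val sqlContext = new HiveContext(sc)
    import sqlContext.implicits._

    // Hypothetical sample data in place of convertRDDToDF(records, ...)
    val df = sc.parallelize(Seq(("John", "Doe"), ("Jane", "Roe")))
               .toDF("first", "last")

    df.registerTempTable("person")

    // Works because the query runs on the same sqlContext that
    // registered the table
    sqlContext.sql("select * from person").show()
  }
}
```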






Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

On 9 May 2016 at 18:33, KhajaAsmath Mohammed <mdkhajaasm...@gmail.com>
wrote:

> Hi,
>
> I have created a dataframe with the code below and I was able to print the
> schema, but unfortunately cannot pull the data from the temporary table. It
> always says that the table is not found.
>
>     val df=convertRDDToDF(records, mapper, errorRecords, sparkContext);
>     import sqlContext._
>     df.printSchema()
>     df.registerTempTable("person")
>     val personRecords = sqlContext.sql("select * from person")
>     personRecords.foreach { println }
>
> Schema Output:
> root
>  |-- address: struct (nullable = true)
>  |    |-- city: string (nullable = true)
>  |    |-- line1: string (nullable = true)
>  |    |-- state: string (nullable = true)
>  |    |-- zip: string (nullable = true)
>  |-- first: string (nullable = true)
>  |-- last: string (nullable = true)
>
> *Error while accessing table:*
> Exception in thread "main" org.apache.spark.sql.AnalysisException: Table
> not found: person;
>
> Does anyone have a solution for this?
>
> Thanks,
> Asmath
>
