Are you certain you are providing Spark with the right Hive configuration?
Is there a valid HIVE_CONF_DIR defined in your spark-env.sh, with a
hive-site.xml detailing the location/etc. of the metastore service and/or
DB?
Without a valid metastore config, Hive may fall back to a local (embedded)
Derby metastore instead.
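For reference, a minimal hive-site.xml pointing at a remote metastore service might look like the following sketch (the host and port are placeholders, not values from your cluster):

```xml
<configuration>
  <!-- URI of the remote Hive metastore Thrift service.
       Replace metastore-host:9083 with your actual host and port. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```

This file should live in the directory your HIVE_CONF_DIR points to so Spark can pick it up.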
Hi,
I am a newbie to Spark, using HDP 2.2, which comes with Spark 1.3.1.
I tried the following code example:
> import org.apache.spark.sql.SQLContext
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> import sqlContext.implicits._
>
> val personFile = "/user/hdfs/TestSpark/Person.csv"
> val