If you don't need to interact with Hive, you can compile Spark without the -Phive flag to eliminate the Hive dependencies. That way, the sqlContext instance in the Spark shell will be of type SQLContext instead of HiveContext.
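
If you'd rather not rebuild Spark, you can also construct a plain SQLContext yourself inside the shell. A minimal sketch, assuming a Spark 1.x spark-shell where sc is the pre-built SparkContext and dir is the path to your Parquet data:

import org.apache.spark.sql.SQLContext

// A plain SQLContext never touches the Hive metastore, so
// HiveMetaStoreClient is never instantiated.
val plainSqlContext = new SQLContext(sc)
val df = plainSqlContext.parquetFile(dir)
df.printSchema()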

The Hive metastore error itself is probably caused by a Hive misconfiguration, e.g. in the hive-site.xml picked up by the shell.
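
Also, if you're on Spark 1.4 or later, sqlContext.read.parquet is the non-deprecated replacement for parquetFile; something like this (same dir as in your snippet):

val df = sqlContext.read.parquet(dir)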

Cheng

On 9/10/15 6:02 PM, Petr Novak wrote:
Hello,

sqlContext.parquetFile(dir)

throws the exception "Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient"

The strange thing is that on the second attempt to open the file it is successful:

// Retry once: the first call fails with the metastore error,
// the second succeeds.
try {
  sqlContext.parquetFile(dir)
} catch {
  case e: Exception => sqlContext.parquetFile(dir)
}

What should I do to make my script run flawlessly in spark-shell when opening Parquet files? It is probably missing some dependency. Or how should I write the code? This double attempt is awful, and I don't need HiveMetaStoreClient; I just need to open a Parquet file.

Many thanks for any ideas,
Petr



