I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to
check a few default properties, so I ran the following statement in
spark-shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")

I was expecting the call to getConf to return a value of 0.13.1, as
described in this link:
<http://spark.apache.org/docs/latest/sql-programming-guide.html#interacting-with-different-versions-of-hive-metastore>

But I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
    at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)

Am I retrieving the properties in the right way?
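
For what it's worth, the exception suggests that in 1.4.x the one-argument
getConf throws NoSuchElementException for any key that was never set
explicitly, even when Spark applies an internal default at use time. A
possible workaround (a sketch, assuming the two-argument getConf overload
that SQLContext exposes in 1.4; the "<not set>" fallback string is my own
placeholder, not a real default) would be:

    scala> // One-arg form throws if the key was never set explicitly:
    scala> // sqlContext.getConf("spark.sql.hive.metastore.version")

    scala> // Two-arg form returns the supplied fallback instead of throwing:
    scala> sqlContext.getConf("spark.sql.hive.metastore.version", "<not set>")

    scala> // Setting the key first should also make the one-arg form work:
    scala> sqlContext.setConf("spark.sql.hive.metastore.version", "0.13.1")
    scala> sqlContext.getConf("spark.sql.hive.metastore.version")

Is the two-argument form the intended way to read such defaults, or should
the one-argument call have worked?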




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Retrieving-Spark-Configuration-properties-tp23881.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
