This is because you did not set the parameter
"spark.sql.hive.metastore.version". Retrieving other parameters that you
have set will work fine. Alternatively, set this parameter first and then
get it.
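For example (a minimal sketch in the Spark 1.4 spark-shell, where
sqlContext is already provided; the "0.13.1" value below is only an
illustration, not a default that getConf would return on its own):

    // Read with an explicit fallback: no exception if the key is unset.
    sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")

    // Or set the property first, then read it back.
    sqlContext.setConf("spark.sql.hive.metastore.version", "0.13.1")
    sqlContext.getConf("spark.sql.hive.metastore.version")  // "0.13.1"

The two-argument getConf(key, defaultValue) returns the fallback instead
of throwing java.util.NoSuchElementException when the key has not been
set.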

2015-07-17 11:53 GMT+08:00 RajG <[email protected]>:

> I am using this version of Spark : *spark-1.4.0-bin-hadoop2.6* . I want to
> check a few default properties, so I gave the following statement in
> spark-shell
>
> scala> sqlContext.getConf("spark.sql.hive.metastore.version")
>
> I was expecting the call to method getConf to return a value of 0.13.1 as
> described in this link
> <http://spark.apache.org/docs/latest/sql-programming-guide.html#interacting-with-different-versions-of-hive-metastore>,
> but I got the below exception:
>
> java.util.NoSuchElementException: spark.sql.hive.metastore.version
>     at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
>     at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
>
> Am I retrieving the properties in the right way?
