Hi,

Setting the Hive configuration as *spark.hive.xxx* should take effect: Spark strips the *spark.* prefix and passes the remaining *hive.xxx* property through to the Hive session. Could you share your configuration?
https://github.com/apache/spark/blob/648457905c4ea7d00e3d88048c63f360045f0714/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala#L108-L115
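As a minimal sketch, a Hive property can be passed at spark-shell launch with the *spark.hive.* prefix; Spark drops the *spark.* part before handing it to Hive. The thrift URI below is a placeholder, not a value from this thread:

```shell
# Sketch: launch spark-shell with a Hive property. Spark strips the
# "spark." prefix, so "hive.metastore.uris" reaches the metastore client.
# "metastore-host:9083" is a placeholder for your standalone metastore.
spark-shell \
  --conf spark.hive.metastore.uris=thrift://metastore-host:9083
```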
And the metastore client initialization code is here:
https://github.com/apache/spark/blob/v3.0.2/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala

Ankur Khanna <ankur.kha...@oracle.com> wrote on Mon, Jul 4, 2022 at 20:50:

> Hi,
>
> I am using spark-shell/spark-sql with Hive Metastore (running as a
> standalone process). I am facing a problem where the custom conf I pass
> while starting spark-shell (using --conf) is not being passed on to the
> metastore with the session.
>
> I'll appreciate help on how to get the properties passed on to the
> metastore when starting a Spark session.
>
> Also, it will be super helpful if someone can point me to the code where
> the metastore client gets initialized when I begin a new spark-shell
> session. (My assumption is that hiveConf should be passed while
> initializing the metastore client instance when a new session is started.)
>
> Spark version: 3.0.2
> Hive version: 3.1.2
>
> Best,
> Ankur Khanna

--
Best!
Qian SUN