Hi,

How do I set Hive configurations in Spark 2.1? I have the following working
in 1.6. How do I set these Hive-related configs using the new SparkSession?


    sqlContext.sql(s"use ${HIVE_DB_NAME}")

    sqlContext.setConf("hive.exec.dynamic.partition", "true")
    sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
    sqlContext.setConf("hive.exec.max.dynamic.partitions.pernode", "100000")
    sqlContext.setConf("hive.exec.max.dynamic.partitions", "100000")
    sqlContext.setConf("hive.scratch.dir.permission", "777")
    sqlContext.setConf("spark.sql.orc.filterPushdown", "true")
    sqlContext.setConf("spark.sql.shuffle.partitions", "2000")
    sqlContext.setConf("hive.default.fileformat", "Orc")
    sqlContext.setConf("hive.exec.orc.memory.pool", "1.0")
    sqlContext.setConf("hive.optimize.sort.dynamic.partition", "true")
    sqlContext.setConf("hive.exec.reducers.max", "2000")

    sqlContext.sql("set hive.default.fileformat=Orc")
    sqlContext.sql("set hive.enforce.bucketing=true")
    sqlContext.sql("set hive.enforce.sorting=true")

    sqlContext.sql("set hive.auto.convert.join=true")
    sqlContext.sql("set hive.optimize.bucketmapjoin=true")
    sqlContext.sql("set hive.optimize.insert.dest.volume=true")
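From the 2.x docs it looks like SparkSession with enableHiveSupport()
replaces HiveContext, spark.conf.set replaces sqlContext.setConf, and
spark.sql still accepts SET statements. Is something like the sketch below
the right approach (app name is just a placeholder; HIVE_DB_NAME is the
same variable as above)?

```scala
import org.apache.spark.sql.SparkSession

// Build a Hive-enabled session; enableHiveSupport() is required to talk to
// the Hive metastore in Spark 2.x. Configs can also be set at build time
// via .config("key", "value") on the builder.
val spark = SparkSession.builder()
  .appName("HiveConfigExample")
  .enableHiveSupport()
  .getOrCreate()

// Runtime configs go through spark.conf.set instead of sqlContext.setConf.
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
spark.conf.set("spark.sql.shuffle.partitions", "2000")

// SET statements and database switching still go through spark.sql.
spark.sql(s"use ${HIVE_DB_NAME}")
spark.sql("set hive.default.fileformat=Orc")
```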



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-hive-configs-in-Spark-2-1-tp28429.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
