I have a variable to be passed to a partition column as shown below:

  val broadcastValue = "123456789"  // I assume this will be sent as a constant for the batch
  // Create a DF on top of the XML and register it as a temp view
  df.createOrReplaceTempView("tmp")

  // Need to create and populate the target Parquet table michtest.BroadcastStaging
  //HiveContext.sql("""DROP TABLE IF EXISTS michtest.BroadcastStaging""")

  var sqltext = """
    CREATE TABLE IF NOT EXISTS michtest.BroadcastStaging (
       partyId     STRING
     , phoneNumber STRING
    )
    PARTITIONED BY (
       broadcastId STRING
     , brand       STRING
    )
    STORED AS PARQUET
  """
  HiveContext.sql(sqltext)

  // Now insert the data from the temp table into the Hive table.
  // Dynamic partitioning is disabled by default, so we turn it on.
  spark.sql("SET hive.exec.dynamic.partition = true")
  spark.sql("SET hive.exec.dynamic.partition.mode = nonstrict")

  sqltext = """
    $INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = $broadcastValue, brand = "dummy")
    SELECT
        ocis_party_id    AS partyId
      , target_mobile_no AS phoneNumber
    FROM tmp
  """
  spark.sql($sqltext)

However, this does not work:

  scala> sqltext = """
       |   $INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = $broadcastValue, brand = "dummy")
       |   SELECT
       |       ocis_party_id    AS partyId
       |     , target_mobile_no AS phoneNumber
       |   FROM tmp
       | """
  sqltext: String =
    $INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = $broadcastValue, brand = "dummy")
    SELECT
        ocis_party_id    AS partyId
      , target_mobile_no AS phoneNumber
    FROM tmp

  scala> spark.sql($sqltext)
  <console>:41: error: not found: value $sqltext
         spark.sql($sqltext)

Any ideas?

Thanks

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.
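[Editor's note] The error suggests two separate issues: `$sqltext` outside a string is just an (undefined) identifier, and `$broadcastValue` is only substituted inside an s-interpolated string. A minimal sketch of a likely fix, assuming a SparkSession named `spark` with Hive support and the `tmp` view registered as above:

  // Sketch, not verified against the original environment.
  // 1) Use an s"""...""" interpolated string so $broadcastValue is substituted.
  // 2) Drop the stray $ before INSERT.
  // 3) Pass the plain variable to spark.sql, i.e. sqltext, not $sqltext.
  val broadcastValue = "123456789"

  val sqltext = s"""
    INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = "$broadcastValue", brand = "dummy")
    SELECT
        ocis_party_id    AS partyId
      , target_mobile_no AS phoneNumber
    FROM tmp
  """
  spark.sql(sqltext)

Since broadcastId is declared STRING, quoting the interpolated partition value (as sketched) is probably safer than emitting a bare numeric literal.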