I started spark-shell with the command below:

spark-shell --master yarn --conf spark.sql.warehouse.dir="/user/spark"
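As I understand it, spark.sql.warehouse.dir is a static conf that has to be in place before the SparkSession is created, so the programmatic equivalent of the command above would be something like the following (just a sketch; I have only actually been running the spark-shell form, and the app name is arbitrary):

import org.apache.spark.sql.SparkSession

// Set the warehouse dir at session-build time, since it cannot be
// changed on an already-running session.
val spark = SparkSession.builder()
  .appName("warehouse-dir-test")
  .config("spark.sql.warehouse.dir", "/user/spark")
  .getOrCreate()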

In spark-shell, the statement below creates a managed table under the
/user/spark HDFS folder:

spark.sql("CREATE TABLE t5 (i int) USING PARQUET")

However, the statements below still write to the local spark-warehouse
folder, i.e. {currentfolder}/spark-warehouse:

case class SymbolInfo(name: String, sector: String)

val siDS = Seq(
  SymbolInfo("AAPL", "IT"),
  SymbolInfo("GOOG", "IT")
).toDS()

siDS.write.saveAsTable("siDS")
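Running the same kind of check on siDS is how I noticed the local path (again, the comments describe what I am looking for, not guaranteed output):

spark.sql("DESCRIBE EXTENDED siDS").show(100, false)  // Location points at the local spark-warehouse folder
spark.sql("DESCRIBE DATABASE default").show(false)    // checking the default database location as well, in case it matters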

How can I make saveAsTable respect spark.sql.warehouse.dir when creating a
managed table? I'd appreciate any help!
