Hi Jay,
Some things to check:
Do you have the following set in your Spark SQL config:
"spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension"
"spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"
Is the JAR for the package delta-core_2.12:0.7.0 available on both the driver and the executors?
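For reference, a minimal spark-submit invocation that sets both configs and pulls the Delta JAR onto the driver and executors might look like this (the application file name is a placeholder; this assumes Spark 3.0.x, which is what delta-core_2.12:0.7.0 targets):

```shell
# --packages resolves delta-core from Maven and ships it to driver + executors.
spark-submit \
  --packages io.delta:delta-core_2.12:0.7.0 \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" \
  your_app.py   # placeholder for your application
```

If you instead add the JAR by hand, make sure it is on the classpath of every node, not just the driver.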
Hi George,
You can try mounting a larger PersistentVolume as the work directory, as
described here, instead of relying on the default local directories, which
may have site-specific size constraints:
https://spark.apache.org/docs/latest/running-on-kubernetes.html#using-kubernetes-volumes
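As a sketch, per the page above, a PersistentVolumeClaim can be mounted into the executor pods with conf options like the following (the claim name "spark-work-pvc" is a placeholder; naming the volume spark-local-dir-1 tells Spark to use it for scratch space):

```shell
spark-submit \
  --master k8s://https://<k8s-apiserver>:6443 \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.options.claimName=spark-work-pvc \
  --conf spark.kubernetes.executor.volumes.persistentVolumeClaim.spark-local-dir-1.mount.path=/opt/spark/work-dir \
  ...
```

The PVC has to exist in the same namespace as the executor pods, and its storage class needs to allow the size you want.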
-Matt
> On Sep 1, 2022, at 09:1