>> and then use Hive's dynamic partitioned insert syntax
What does this entail? The same SQL, but you need to run
set hive.exec.dynamic.partition = true;
in the Hive/SQL context (along with several other related dynamic
partition settings).
Is there anything else/special required?
Thanks, Michael
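For reference, the usual pair of Hive settings for a dynamic partitioned insert looks like the sketch below. The table names (`events`, `staging_events`) and the partition column (`dt`) are hypothetical placeholders, not from this thread:

```sql
-- Enable dynamic partitioning; "nonstrict" allows all partition
-- columns to be determined dynamically (no static partition needed).
SET hive.exec.dynamic.partition = true;
SET hive.exec.dynamic.partition.mode = nonstrict;

-- The partition column (dt) must come last in the SELECT list;
-- Hive derives the target partition for each row from its value.
INSERT INTO TABLE events PARTITION (dt)
SELECT col1, col2, dt FROM staging_events;
```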
Thanks for the response. Here is my understanding; correct me if I am wrong:
1) Partitioned tables written by Spark SQL do not have partition metadata
written to the Hive metastore. Spark SQL discovers partitions from the table
location on the underlying DFS, not from the metastore. It does this the fir
>
> Is it possible to add a new partition to a persistent table using Spark
> SQL? The following call works and data gets written in the correct
> directories, but no partition metadata is added to the Hive metastore.
>
I believe if you use Hive's dynamic partitioned insert syntax then we will
Hi guys
Is it possible to add a new partition to a persistent table using Spark SQL?
The following call works and data gets written in the correct
directories, but no partition metadata is added to the Hive metastore.
In addition, I see nothing preventing an arbitrary schema from being appended to
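As a possible workaround, partition directories that already exist on the DFS can be registered in the metastore by hand with standard HiveQL. This is only a sketch; the table name `events`, the partition column `dt`, and the HDFS path are hypothetical:

```sql
-- Register one known directory as a partition in the metastore.
ALTER TABLE events ADD IF NOT EXISTS PARTITION (dt = '2015-01-01')
LOCATION 'hdfs:///warehouse/events/dt=2015-01-01';

-- Or scan the table's location and add all missing partitions at once.
MSCK REPAIR TABLE events;
```

Either way, Hive (and anything reading through the metastore) then sees the partitions that Spark SQL wrote directly to the file system.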