Github user viirya commented on a diff in the pull request:
https://github.com/apache/spark/pull/19622#discussion_r148170423
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -295,7 +297,7 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
storage = table.storage.copy(
locationUri = None,
properties = storagePropsWithLocation),
- schema = table.partitionSchema,
+ schema = StructType(EMPTY_DATA_SCHEMA ++ table.partitionSchema),
--- End diff ---
Don't we need to add the empty data schema in
`newHiveCompatibleMetastoreTable` too? In the existing `HiveClientImpl`, we
add the empty schema only when two conditions are met: `schema.isEmpty` and
`HiveExternalCatalog.isDatasourceTable(table)`.
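For illustration, a minimal self-contained sketch of the placeholder-schema logic being discussed (this is not Spark's actual code; the simplified `CatalogTable`/`StructField` types, the `"col": array<string>` placeholder column, and the provider-property check are assumptions modeled loosely on `HiveExternalCatalog` and `HiveClientImpl`):

```scala
// Simplified stand-ins for Spark's catalog types (illustrative only).
case class StructField(name: String, dataType: String)
case class CatalogTable(
    dataSchema: Seq[StructField],
    partitionSchema: Seq[StructField],
    properties: Map[String, String])

object MetastoreSchema {
  // Hive rejects tables with zero columns, so a dummy column stands in
  // for the data schema of schema-less data source tables (assumed shape).
  val EMPTY_DATA_SCHEMA: Seq[StructField] =
    Seq(StructField("col", "array<string>"))

  // A table is treated as a data source table when the provider
  // property is present in its table properties.
  val DATASOURCE_PROVIDER = "spark.sql.sources.provider"

  def isDatasourceTable(table: CatalogTable): Boolean =
    table.properties.contains(DATASOURCE_PROVIDER)

  // Schema handed to the metastore: if the data schema is empty AND this
  // is a data source table, substitute the placeholder, so the stored
  // schema becomes EMPTY_DATA_SCHEMA ++ partition columns.
  def metastoreSchema(table: CatalogTable): Seq[StructField] = {
    val data =
      if (table.dataSchema.isEmpty && isDatasourceTable(table)) EMPTY_DATA_SCHEMA
      else table.dataSchema
    data ++ table.partitionSchema
  }
}
```

Under these assumptions, a partitioned data source table with an empty data schema would be stored as the placeholder column followed by the partition columns, which is the point of the `EMPTY_DATA_SCHEMA ++ table.partitionSchema` change in the diff above.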