Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/15900#discussion_r88383221
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -1023,6 +1023,11 @@ object HiveExternalCatalog {
       // After SPARK-6024, we removed this flag.
       // Although we are not using `spark.sql.sources.schema` any more, we need to still support.
       DataType.fromJson(schema.get).asInstanceOf[StructType]
+    } else if (props.filterKeys(_.startsWith(DATASOURCE_SCHEMA_PREFIX)).isEmpty) {
+      // If there is no schema information in table properties, it means the schema of this table
+      // was empty when saving into metastore, which is possible in older version of Spark. We
+      // should respect it.
+      new StructType()
--- End diff ---
No. Since we also store the schema for Hive tables, Hive tables will also call this function. But a Hive table will never go into this branch, because it always has a schema (the removal of runtime schema inference happened before we started storing the schema of Hive tables).
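
For readers without the surrounding file open, here is a minimal Scala sketch of how this restore path branches. It is illustrative only: the object name, the property-key values, and the fall-through error are placeholders, not the exact Spark source, which splits large schemas across numbered part properties.

    import org.apache.spark.sql.types.{DataType, StructType}

    object SchemaRestoreSketch {
      // Illustrative property keys; the real constants live in object HiveExternalCatalog.
      val DATASOURCE_SCHEMA = "spark.sql.sources.schema"
      val DATASOURCE_SCHEMA_PREFIX = DATASOURCE_SCHEMA + "."

      def restoreSchema(props: Map[String, String]): StructType = {
        val schema = props.get(DATASOURCE_SCHEMA)
        if (schema.isDefined) {
          // Very old versions stored the whole schema as a single JSON string.
          DataType.fromJson(schema.get).asInstanceOf[StructType]
        } else if (props.filterKeys(_.startsWith(DATASOURCE_SCHEMA_PREFIX)).isEmpty) {
          // No schema keys at all: the table was saved with an empty schema by an
          // older version of Spark, so respect that and return an empty schema.
          new StructType()
        } else {
          // Otherwise the schema was split across numbered part properties and has
          // to be reassembled; that path is left out of this sketch.
          throw new UnsupportedOperationException("schema-part reassembly omitted")
        }
      }
    }

As the comment above notes, a Hive table always carries a schema by the time this code runs, so only data source tables written by old Spark versions can reach the empty branch.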