Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19622#discussion_r148219682
  
    --- Diff: 
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -295,7 +297,7 @@ private[spark] class HiveExternalCatalog(conf: 
SparkConf, hadoopConf: Configurat
             storage = table.storage.copy(
               locationUri = None,
               properties = storagePropsWithLocation),
    -        schema = table.partitionSchema,
    +        schema = StructType(EMPTY_DATA_SCHEMA ++ table.partitionSchema),
    --- End diff --
    
    according to https://issues.apache.org/jira/browse/SPARK-19279 , I think 
this new behavior is more reasonable: a table with an empty data schema is 
problematic for Hive and thus not Hive-compatible.  cc @gatorsmile 
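    The change above can be sketched roughly as follows. This is a simplified illustration, not the actual Spark code: it uses stand-in `StructField`/`StructType` types (the real ones live in `org.apache.spark.sql.types`), and the value assigned to `EMPTY_DATA_SCHEMA` here is a hypothetical placeholder column mirroring the idea in `HiveExternalCatalog`, not its exact definition.

```scala
// Simplified stand-ins for Spark's StructField/StructType.
case class StructField(name: String, dataType: String)
case class StructType(fields: Seq[StructField]) {
  def ++(other: StructType): StructType = StructType(fields ++ other.fields)
}

// Hypothetical placeholder data schema: a single dummy column, so that the
// schema handed to the Hive metastore never has zero data columns.
val EMPTY_DATA_SCHEMA = StructType(Seq(StructField("col", "array<string>")))

val partitionSchema = StructType(Seq(StructField("dt", "string")))

// Old behavior: schema = partitionSchema only, leaving the data schema empty,
// which Hive cannot represent. New behavior: prepend the placeholder column.
val schema = EMPTY_DATA_SCHEMA ++ partitionSchema

assert(schema.fields.map(_.name) == Seq("col", "dt"))
```

    The point is that the table's stored schema is always the data schema followed by the partition schema, so a partitioned table with no data columns still gets at least one placeholder column and remains representable in the Hive metastore.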

