Github user xwu0226 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16626#discussion_r97213702
  
    --- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
    @@ -584,14 +593,18 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
          // Sets the `schema`, `partitionColumnNames` and `bucketSpec` from the old table definition,
          // to retain the spark specific format if it is. Also add old data source properties to table
          // properties, to retain the data source table format.
    -      val oldDataSourceProps = oldTableDef.properties.filter(_._1.startsWith(DATASOURCE_PREFIX))
    --- End diff --
    
    I think the variable name needs to change, since both Hive tables and data source tables now populate the table properties with the schema, and both cases go through this path. I am temporarily blocking ALTER TABLE ADD COLUMNS for data source tables because I am not yet confident there are no holes. But according to @gatorsmile, it may be safe to support data source tables too, so I am adding more test cases to confirm. I may remove the condition in this PR.
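    
    To illustrate the rename being discussed, a rough sketch of what the filtering could look like once the map is no longer data-source-specific (the name `oldTableProps` and the exact shape of the assignment are illustrative assumptions, not the actual patch):
    
        // Illustrative sketch only; `oldTableProps` is a hypothetical name, not from the PR.
        // Both Hive and data source tables now write Spark-specific entries (schema, etc.)
        // into the table properties, so the map kept here is not data-source-specific.
        val oldTableProps: Map[String, String] =
          oldTableDef.properties.filter { case (key, _) => key.startsWith(DATASOURCE_PREFIX) }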

