Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19622#discussion_r148168917
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala ---
    @@ -148,17 +148,15 @@ abstract class ExternalCatalog
       def alterTable(tableDefinition: CatalogTable): Unit
     
       /**
    -   * Alter the schema of a table identified by the provided database and table name. The new schema
    -   * should still contain the existing bucket columns and partition columns used by the table. This
    -   * method will also update any Spark SQL-related parameters stored as Hive table properties (such
    -   * as the schema itself).
    +   * Alter the data schema of a table identified by the provided database and table name. The new
    +   * data schema should not have conflict column names with the existing partition columns, and
    +   * should still contain all the existing data columns.
        *
        * @param db Database that table to alter schema for exists in
        * @param table Name of table to alter schema for
    -   * @param schema Updated schema to be used for the table (must contain existing partition and
    -   *               bucket columns)
    +   * @param newDataSchema Updated data schema to be used for the table.
        */
    -  def alterTableSchema(db: String, table: String, schema: StructType): Unit
    +  def alterTableDataSchema(db: String, table: String, newDataSchema: StructType): Unit
    --- End diff --
    
    Looks like we don't support dropping columns yet. The new description is clearer.
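    
    For reference, a minimal sketch of how a caller might use the renamed method, assuming an already-constructed `ExternalCatalog` implementation and that `CatalogTable.dataSchema` exposes the non-partition columns; the table name `db.logs` and the column `event_id` are made up for illustration:
    
        import org.apache.spark.sql.catalyst.catalog.ExternalCatalog
        import org.apache.spark.sql.types.IntegerType
        
        // Hypothetical helper: append a nullable `event_id` column to `db.logs`.
        // `catalog` is assumed to be an existing ExternalCatalog implementation.
        def addEventIdColumn(catalog: ExternalCatalog): Unit = {
          val table = catalog.getTable("db", "logs")
          // Per the updated doc: keep all existing data columns and do not clash
          // with partition column names; here we only append a new field.
          val newDataSchema = table.dataSchema.add("event_id", IntegerType)
          catalog.alterTableDataSchema("db", "logs", newDataSchema)
        }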


---
