Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/16944#discussion_r104243616
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala ---
@@ -597,6 +597,16 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     }
   }
+  override def alterTableSchema(db: String, table: String, schema: StructType): Unit = withClient {
+    requireTableExists(db, table)
+    val rawTable = getRawTable(db, table)
+    val withNewSchema = rawTable.copy(schema = schema)
+    // Add table metadata such as table schema, partition columns, etc. to table properties.
+    val updatedTable = withNewSchema.copy(
+      properties = withNewSchema.properties ++ tableMetaToTableProps(withNewSchema))
+    client.alterTable(updatedTable)
--- End diff ---
One more thing: if you look at `def createDataSourceTable`, we need to handle one
special case. When saving a table to Hive, we should wrap the call in a try/catch;
if it fails, convert the table to the non-Hive-compatible format (set the schema
to `Nil`) and save again. We should follow the same pattern here, in case the
schema we want to alter to is not Hive compatible.
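
For context, a minimal sketch of what that fallback could look like for
`alterTableSchema`, following the `createDataSourceTable` pattern. This is an
illustration, not the actual change: it reuses the members visible in the diff
above (`withClient`, `requireTableExists`, `getRawTable`, `tableMetaToTableProps`,
`client`) and assumes the surrounding class mixes in Spark's `Logging` trait for
`logWarning`:

```scala
import scala.util.control.NonFatal

import org.apache.spark.sql.types.StructType

override def alterTableSchema(db: String, table: String, schema: StructType): Unit = withClient {
  requireTableExists(db, table)
  val rawTable = getRawTable(db, table)
  val withNewSchema = rawTable.copy(schema = schema)
  // The full schema is always preserved in the table properties, so dropping
  // it from the Hive-visible table definition loses no information.
  val updatedTable = withNewSchema.copy(
    properties = withNewSchema.properties ++ tableMetaToTableProps(withNewSchema))
  try {
    // First attempt: store the new schema in a Hive-compatible way.
    client.alterTable(updatedTable)
  } catch {
    case NonFatal(e) =>
      // The new schema may not be Hive compatible (e.g. it uses types the
      // Hive metastore rejects). Fall back to the Spark SQL specific format:
      // store an empty schema in the metastore and rely on the copy kept in
      // the table properties above.
      logWarning(s"Could not alter schema of table $db.$table in a " +
        "Hive compatible way. Updating Hive metastore in Spark SQL specific format.", e)
      client.alterTable(updatedTable.copy(schema = StructType(Nil)))
  }
}
```

A real implementation would also need to decide how to handle partition columns
in the fallback, since Hive still expects them in the table definition even when
the data schema is stored only in the table properties.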