yaooqinn commented on a change in pull request #32563:
URL: https://github.com/apache/spark/pull/32563#discussion_r638889940



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/command/tables.scala
##########
@@ -869,8 +874,12 @@ case class ShowTablesCommand(
       val database = tableIdent.database.getOrElse("")
       val tableName = tableIdent.table
       val isTemp = catalog.isTempView(tableIdent)
-      val information = partition.simpleString
-      Seq(Row(database, tableName, isTemp, s"$information\n"))
+      val infoValue = if (conf.getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)) {

Review comment:
       seems worth having `def legacyKeepCommandOutputSchema: Boolean = getConf(LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)`
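
The accessor suggested above can be sketched in isolation. Note this is a simplified, hypothetical stand-in: `ConfigEntry` and `SQLConfLike` below only mimic Spark's `SQLConf` machinery, and the real accessor would be added to `SQLConf` itself alongside its other convenience getters:

```scala
// Hypothetical sketch of the proposed convenience accessor.
// ConfigEntry and SQLConfLike are simplified stand-ins for Spark's
// SQLConf machinery, not the real classes.
final case class ConfigEntry(key: String, default: Boolean)

class SQLConfLike(settings: Map[String, String]) {
  val LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA: ConfigEntry =
    ConfigEntry("spark.sql.legacy.keepCommandOutputSchema", default = false)

  // generic lookup, analogous to SQLConf.getConf(entry)
  def getConf(entry: ConfigEntry): Boolean =
    settings.get(entry.key).map(_.toBoolean).getOrElse(entry.default)

  // the accessor the review proposes: call sites then read
  // `conf.legacyKeepCommandOutputSchema` instead of repeating
  // getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)
  def legacyKeepCommandOutputSchema: Boolean =
    getConf(LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)
}
```

This keeps each call site short and gives the flag lookup a single, searchable name.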

##########
File path: docs/sql-migration-guide.md
##########
@@ -48,8 +48,8 @@ license: |
  - In Spark 3.2, the auto-generated `Cast` (such as those added by type coercion rules) will be stripped when generating column alias names. E.g., `sql("SELECT floor(1)").columns` will be `FLOOR(1)` instead of `FLOOR(CAST(1 AS DOUBLE))`.
   
  - In Spark 3.2, the output schema of `SHOW TABLES` becomes `namespace: string, tableName: string, isTemporary: boolean`. In Spark 3.1 or earlier, the `namespace` field was named `database` for the builtin catalog, and there is no `isTemporary` field for v2 catalogs. To restore the old schema with the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to `true`.
-  
-  - In Spark 3.2, the output schema of `SHOW TABLE EXTENDED` becomes `namespace: string, tableName: string, isTemporary: boolean, information: string`. In Spark 3.1 or earlier, the `namespace` field was named `database` for the builtin catalog, and no change for the v2 catalogs. To restore the old schema with the builtin catalog, you can set `spark.sql.legacy.keepCommandOutputSchema` to `true`.
+
+  - In Spark 3.2, the output schema of `SHOW TABLE EXTENDED` becomes `namespace: string, tableName: string, isTemporary: boolean, information: map<string, string>`. In Spark 3.1 or earlier, the `namespace` field was named `database` for the builtin catalog, and no change for the v2 catalogs. In Spark 3.1 or earlier, the `information` field was string type. To restore the old schema, you can set `spark.sql.legacy.keepCommandOutputSchema` to `true`.

Review comment:
       not related to this PR, but does `legacy.keep` sound redundant?
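
To illustrate the schema change the migration note above describes: in Spark 3.1 and earlier the `information` field of `SHOW TABLE EXTENDED` was a single newline-delimited string of `Key: Value` lines, while in 3.2 the same data is exposed as `map<string, string>`. A rough, hypothetical sketch of that conversion (`InformationField` and its method are illustrative names, not Spark APIs):

```scala
// Hypothetical sketch, not Spark code: convert the legacy string-typed
// `information` column (newline-delimited "Key: Value" lines) into the
// map<string, string> shape that Spark 3.2's schema uses.
object InformationField {
  def legacyStringToMap(information: String): Map[String, String] =
    information
      .split("\n")
      .iterator
      .filter(_.contains(": "))       // skip lines without a "Key: Value" shape
      .map { line =>
        val idx = line.indexOf(": ")  // split on the first separator only,
        line.substring(0, idx) ->     // so values may themselves contain ": "
          line.substring(idx + 2)
      }
      .toMap
}
```

The map form lets callers look up a single property by key instead of parsing the multi-line string themselves.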




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


