imback82 commented on a change in pull request #34255:
URL: https://github.com/apache/spark/pull/34255#discussion_r726802158
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -3364,7 +3364,7 @@ object SQLConf {
     buildConf("spark.sql.legacy.keepCommandOutputSchema")
       .internal()
       .doc("When true, Spark will keep the output schema of commands such as SHOW DATABASES " +
-        "unchanged, for v1 catalog and/or table.")
Review comment:
@cloud-fan I am removing the reference to `v1 catalog` since the `KeepLegacyOutputs` rule runs regardless of the catalog version. Let me know if that is incorrect and `KeepLegacyOutputs` needs to be updated to run only for the session catalog.
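
For reference, a small usage sketch (not part of this diff; the column names shown are examples) of how the flag behaves: because `KeepLegacyOutputs` matches on the logical plan, the output schema toggles with the config globally, independent of which catalog is current.

```scala
// Sketch: the legacy flag controls the command output schema via SQLConf,
// so SHOW DATABASES reports the same column name under any current catalog.
spark.conf.set("spark.sql.legacy.keepCommandOutputSchema", "true")
spark.sql("SHOW DATABASES").schema.fieldNames   // legacy name, e.g. "databaseName"

spark.conf.set("spark.sql.legacy.keepCommandOutputSchema", "false")
spark.sql("SHOW DATABASES").schema.fieldNames   // current name, e.g. "namespace"
```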
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
##########
@@ -270,13 +262,7 @@ class ResolveSessionCatalog(val catalogManager: CatalogManager)
       DropDatabaseCommand(db, d.ifExists, d.cascade)
     case ShowTables(DatabaseInSessionCatalog(db), pattern, output) if conf.useV1Command =>
-      val newOutput = if (conf.getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)) {
Review comment:
This was already handled in `KeepLegacyOutputs`, so I am removing this
now.
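
For context, a minimal sketch of the kind of handling that now lives in `KeepLegacyOutputs` (illustrative only, assuming the rule matches `ShowTables` and renames its first output column; the real rule body may differ), which is what makes the inline `newOutput` adjustment here redundant:

```scala
import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, ShowTables}
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.internal.SQLConf

// Illustrative sketch of a legacy-output rule: it rewrites the plan's output
// attributes based solely on the SQL config, before ResolveSessionCatalog
// converts the plan into a v1 command, so the command can reuse the output as-is.
object KeepLegacyOutputsSketch extends Rule[LogicalPlan] {
  override def apply(plan: LogicalPlan): LogicalPlan = {
    if (!conf.getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)) {
      plan
    } else {
      plan.resolveOperators {
        // SHOW TABLES: restore the legacy name of the first output column.
        case s: ShowTables =>
          s.copy(output = s.output.head.withName("database") +: s.output.tail)
      }
    }
  }
}
```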