AngersZhuuuu commented on a change in pull request #31378:
URL: https://github.com/apache/spark/pull/31378#discussion_r571998057
##########
File path: sql/core/src/test/resources/sql-tests/inputs/show-tblproperties.sql
##########
@@ -6,6 +6,13 @@ SHOW TBLPROPERTIES tbl;
SHOW TBLPROPERTIES tbl("p1");
SHOW TBLPROPERTIES tbl("p3");
+set spark.sql.legacy.keepCommandOutputSchema=true;
Review comment:
> Can we follow other PRs and test the config in the scala test?
Updated. Should we merge `test("SPARK-34240 Unify output of SHOW TBLPROPERTIES and pass output attributes properly")` and `test("show tblproperties for hive table")`?
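
For context on what the Scala-side coverage could look like, here is a minimal sketch; the suite name, table name, and exact output column names are assumptions of mine, not the PR's actual test code (which lives in the tests named above):

```scala
import org.apache.spark.sql.QueryTest
import org.apache.spark.sql.internal.SQLConf
import org.apache.spark.sql.test.SharedSparkSession

// Hypothetical suite name; in the PR this coverage sits in the existing
// SHOW TBLPROPERTIES tests referenced above.
class ShowTblPropertiesLegacySchemaSuite extends QueryTest with SharedSparkSession {

  test("SHOW TBLPROPERTIES output schema respects the legacy config") {
    withTable("tbl") {
      sql("CREATE TABLE tbl (id INT) USING parquet TBLPROPERTIES ('p1' = 'v1')")

      // Unified schema (default after SPARK-34240): both columns are returned,
      // assuming the unified output names them "key" and "value".
      assert(sql("SHOW TBLPROPERTIES tbl('p1')").schema.fieldNames.toSeq ===
        Seq("key", "value"))

      withSQLConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA.key -> "true") {
        // Legacy schema: with an explicit key, only the value column is kept,
        // which is what the `output.tail` in the resolver change produces.
        assert(sql("SHOW TBLPROPERTIES tbl('p1')").schema.fieldNames.toSeq ===
          Seq("value"))
      }
    }
  }
}
```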
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/catalyst/analysis/ResolveSessionCatalog.scala
##########
@@ -481,8 +481,14 @@ class ResolveSessionCatalog(val catalogManager: CatalogManager)
        throw QueryCompilationErrors.externalCatalogNotSupportShowViewsError(resolved)
      }
-    case ShowTableProperties(ResolvedV1TableOrViewIdentifier(ident), propertyKey) =>
-      ShowTablePropertiesCommand(ident.asTableIdentifier, propertyKey)
+    case s @ ShowTableProperties(ResolvedV1TableOrViewIdentifier(ident), propertyKey, output) =>
+      val newOutput = if (conf.getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA)) {
+        assert(output.length == 2)
+        output.tail
Review comment:
> we should only do it if key is specified.
Fixed
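
For readers skimming the thread, a rough sketch of the shape the fix takes after this comment: the legacy trim is guarded on the property key being specified. This is illustrative only; `propertyKey` is assumed to be an `Option`, and passing `newOutput` through to the v1 command is inferred from the surrounding diff rather than quoted from the PR.

```scala
// Sketch of the guarded branch in ResolveSessionCatalog: the legacy
// single-column output is only used when a specific property key was requested.
case s @ ShowTableProperties(ResolvedV1TableOrViewIdentifier(ident), propertyKey, output) =>
  val newOutput =
    if (conf.getConf(SQLConf.LEGACY_KEEP_COMMAND_OUTPUT_SCHEMA) && propertyKey.isDefined) {
      assert(output.length == 2)
      output.tail // drop the "key" column, keep only "value"
    } else {
      output
    }
  ShowTablePropertiesCommand(ident.asTableIdentifier, propertyKey, newOutput)
```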
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]