maropu commented on a change in pull request #31378:
URL: https://github.com/apache/spark/pull/31378#discussion_r568376839



##########
File path: 
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveCommandSuite.scala
##########
@@ -129,7 +129,7 @@ class HiveCommandSuite extends QueryTest with SQLTestUtils 
with TestHiveSingleto
 
     checkAnswer(
       sql(s"SHOW TBLPROPERTIES parquet_tab1('my_key1')"),
-      Row("v1") :: Nil
+      Row("my_key1", "v1") :: Nil

Review comment:
       Changing the output schema looks fine to me. Could you describe this 
change in the PR description?
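
For illustration, a minimal, hypothetical sketch (not part of the PR) of the
behavior the updated assertion checks: after this change,
`SHOW TBLPROPERTIES tbl('key')` returns both `key` and `value` columns instead
of only `value`. The table and property names below come from the test above;
everything else is assumed.

```scala
// Hypothetical sketch of the behavior change asserted above; not PR code.
// Runs against a plain local SparkSession (no Hive metastore needed).
import org.apache.spark.sql.SparkSession

object ShowTblPropertiesSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("show-tblproperties-sketch")
      .getOrCreate()

    spark.sql(
      "CREATE TABLE parquet_tab1 (id INT) USING parquet " +
        "TBLPROPERTIES ('my_key1' = 'v1')")

    // Spark 3.1 and earlier: a single `value` column -> Row("v1")
    // With this change (Spark 3.2): `key` and `value` -> Row("my_key1", "v1")
    spark.sql("SHOW TBLPROPERTIES parquet_tab1('my_key1')").show()

    spark.stop()
  }
}
```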

##########
File path: docs/sql-migration-guide.md
##########
@@ -56,6 +56,8 @@ license: |
   In Spark 3.1 and earlier, table refreshing leaves dependents uncached.
 
   - In Spark 3.2, the usage of `count(tblName.*)` is blocked to avoid 
producing ambiguous results. Because `count(*)` and `count(tblName.*)` will 
output differently if there is any null values. To restore the behavior before 
Spark 3.2, you can set 
`spark.sql.legacy.allowStarWithSingleTableIdentifierInCount` to `true`.
+ 
+  - In Spark 3.2, `ShowTablePropertiesCommand`'s output both show `key` and 
`value` columns whether `propertyKey` is defined. In Spark 3.1 and earlier, 
`ShowTablePropertiesCommand`'s output only show `value` column when 
`propertyKey` is defined.

Review comment:
       `ShowTablePropertiesCommand` is an internal class name, so please use
"a `SHOW TBLPROPERTIES` clause" instead.
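
As background for the `count(tblName.*)` entry quoted in the hunk above, a
minimal, hypothetical sketch (not part of the PR) of why `count(*)` and a
star expanded to `count(col1, col2, ...)` can disagree once null values are
present:

```scala
// Hypothetical sketch (not PR code): `count(*)` counts every row, while
// `count(col1, col2, ...)` only counts rows where all arguments are non-null.
import org.apache.spark.sql.SparkSession

object CountStarSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("count-star-sketch")
      .getOrCreate()
    import spark.implicits._

    Seq[(Option[Int], Option[String])](
      (Some(1), Some("a")), (None, Some("b")), (Some(3), None))
      .toDF("x", "y")
      .createOrReplaceTempView("t")

    spark.sql("SELECT count(*) FROM t").show()    // 3: all rows counted
    spark.sql("SELECT count(x, y) FROM t").show() // 1: only the fully non-null row

    spark.stop()
  }
}
```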

##########
File path: docs/sql-migration-guide.md
##########
@@ -56,6 +56,8 @@ license: |
   In Spark 3.1 and earlier, table refreshing leaves dependents uncached.
 
   - In Spark 3.2, the usage of `count(tblName.*)` is blocked to avoid 
producing ambiguous results. Because `count(*)` and `count(tblName.*)` will 
output differently if there is any null values. To restore the behavior before 
Spark 3.2, you can set 
`spark.sql.legacy.allowStarWithSingleTableIdentifierInCount` to `true`.
+ 
+  - In Spark 3.2, `ShowTablePropertiesCommand`'s output both show `key` and 
`value` columns whether `propertyKey` is defined. In Spark 3.1 and earlier, 
`ShowTablePropertiesCommand`'s output only show `value` column when 
`propertyKey` is defined.

Review comment:
       "only show `value` column" -> "only show a `value` column"

##########
File path: docs/sql-migration-guide.md
##########
@@ -56,6 +56,8 @@ license: |
   In Spark 3.1 and earlier, table refreshing leaves dependents uncached.
 
   - In Spark 3.2, the usage of `count(tblName.*)` is blocked to avoid 
producing ambiguous results. Because `count(*)` and `count(tblName.*)` will 
output differently if there is any null values. To restore the behavior before 
Spark 3.2, you can set 
`spark.sql.legacy.allowStarWithSingleTableIdentifierInCount` to `true`.
+ 
+  - In Spark 3.2, `ShowTablePropertiesCommand`'s output both show `key` and 
`value` columns whether `propertyKey` is defined. In Spark 3.1 and earlier, 
`ShowTablePropertiesCommand`'s output only show `value` column when 
`propertyKey` is defined.

Review comment:
       nit: "only show `value` column" -> "only show a `value` column"

##########
File path: docs/sql-migration-guide.md
##########
@@ -56,6 +56,8 @@ license: |
   In Spark 3.1 and earlier, table refreshing leaves dependents uncached.
 
   - In Spark 3.2, the usage of `count(tblName.*)` is blocked to avoid 
producing ambiguous results. Because `count(*)` and `count(tblName.*)` will 
output differently if there is any null values. To restore the behavior before 
Spark 3.2, you can set 
`spark.sql.legacy.allowStarWithSingleTableIdentifierInCount` to `true`.
+ 
+  - In Spark 3.2, a `SHOW TBLPROPERTIES` clause's output both show `key` and 
`value` columns whether `propertyKey` is defined. In Spark 3.1 and earlier, a 
`SHOW TBLPROPERTIES` clause's output only show a `value` column when 
`propertyKey` is defined.

Review comment:
       `propertyKey` is also an internal variable name?

##########
File path: docs/sql-migration-guide.md
##########
@@ -56,6 +56,8 @@ license: |
   In Spark 3.1 and earlier, table refreshing leaves dependents uncached.
 
   - In Spark 3.2, the usage of `count(tblName.*)` is blocked to avoid 
producing ambiguous results. Because `count(*)` and `count(tblName.*)` will 
output differently if there is any null values. To restore the behavior before 
Spark 3.2, you can set 
`spark.sql.legacy.allowStarWithSingleTableIdentifierInCount` to `true`.
+ 
+  - In Spark 3.2, a `SHOW TBLPROPERTIES` clause's output both show `key` and 
`value` columns whether `propertyKey` is defined. In Spark 3.1 and earlier, a 
`SHOW TBLPROPERTIES` clause's output only show a `value` column when 
`propertyKey` is defined.

Review comment:
       "output both show `key` and `value` columns" => "output shows `key` and 
`value` columns"




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


