This is an automated email from the ASF dual-hosted git repository.

wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


    from 3469ec6b4196 [SPARK-48656][CORE] Do a length check and throw COLLECTION_SIZE_LIMIT_EXCEEDED error in `CartesianRDD.getPartitions`
     add cd8bf110b7f5 [SPARK-48659][SQL][TESTS] Unify v1 and v2 ALTER TABLE .. SET TBLPROPERTIES tests

No new revisions were added by this update.

Summary of changes:
 .../spark/sql/catalyst/parser/DDLParserSuite.scala |  8 ---
 ...=> AlterTableSetTblPropertiesParserSuite.scala} | 40 +++++++------
 .../AlterTableSetTblPropertiesSuiteBase.scala      | 69 ++++++++++++++++++++++
 .../sql/execution/command/DDLParserSuite.scala     | 12 ----
 .../spark/sql/execution/command/DDLSuite.scala     | 38 ------------
 .../v1/AlterTableSetTblPropertiesSuite.scala       | 62 +++++++++++++++++++
 ...scala => AlterTableSetTblPropertiesSuite.scala} | 46 +++++----------
 .../spark/sql/hive/execution/HiveDDLSuite.scala    |  4 --
 ...scala => AlterTableSetTblPropertiesSuite.scala} |  5 +-
 9 files changed, 169 insertions(+), 115 deletions(-)
 copy sql/core/src/test/scala/org/apache/spark/sql/execution/command/{AlterNamespaceSetPropertiesParserSuite.scala => AlterTableSetTblPropertiesParserSuite.scala} (54%)
 create mode 100644 sql/core/src/test/scala/org/apache/spark/sql/execution/command/AlterTableSetTblPropertiesSuiteBase.scala
 create mode 100644 sql/core/src/test/scala/org/apache/spark/sql/execution/command/v1/AlterTableSetTblPropertiesSuite.scala
 copy sql/core/src/test/scala/org/apache/spark/sql/execution/command/v2/{AlterTableSetLocationSuite.scala => AlterTableSetTblPropertiesSuite.scala} (51%)
 copy sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/command/{TruncateTableSuite.scala => AlterTableSetTblPropertiesSuite.scala} (82%)


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]