HeartSaVioR commented on a change in pull request #27107: [SPARK-30436][SQL] 
Allow CREATE EXTERNAL TABLE with only requiring LOCATION
URL: https://github.com/apache/spark/pull/27107#discussion_r363587788
 
 

 ##########
 File path: 
sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLParserSuite.scala
 ##########
 @@ -246,18 +246,17 @@ class DDLParserSuite extends AnalysisTest with 
SharedSparkSession {
 
   test("create hive external table - location must be specified") {
     assertUnsupported(
-      sql = "CREATE EXTERNAL TABLE my_tab STORED AS parquet",
+      sql = "CREATE EXTERNAL TABLE my_tab",
 
 Review comment:
   So the test originally didn't have `STORED AS parquet`, and SPARK-30098 added it. (I 
guess it was added because it didn't work without that.) 
   
   This change just reverts to the original one, but it's also OK to add 
another test that covers `STORED AS`.
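
   If we do add a separate test for the `STORED AS` variant, a minimal sketch 
(reusing the suite's existing `assertUnsupported` helper; the test name and 
expected error phrases here are illustrative, not taken from this PR) could look like:

   ```scala
   // Hypothetical companion test: keep the bare CREATE EXTERNAL TABLE case above,
   // and separately exercise the STORED AS form that SPARK-30098 introduced.
   test("create hive external table with STORED AS - location must be specified") {
     assertUnsupported(
       sql = "CREATE EXTERNAL TABLE my_tab STORED AS parquet",
       containsThesePhrases = Seq("create external table", "location"))
   }
   ```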

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
[email protected]


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
