dongjoon-hyun commented on a change in pull request #27107: [SPARK-30436][SQL] Allow CREATE EXTERNAL TABLE with only requiring LOCATION
URL: https://github.com/apache/spark/pull/27107#discussion_r363582775
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLParserSuite.scala
 ##########
 @@ -246,18 +246,17 @@ class DDLParserSuite extends AnalysisTest with SharedSparkSession {
 
   test("create hive external table - location must be specified") {
     assertUnsupported(
-      sql = "CREATE EXTERNAL TABLE my_tab STORED AS parquet",
+      sql = "CREATE EXTERNAL TABLE my_tab",
 
 Review comment:
   Do we have to remove this `STORED AS parquet`? The most backward-compatible approach for this test case seems to be keeping both `STORED AS parquet` and the unsupported exception, without any changes to this test case.
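   For illustration, a minimal sketch of what leaving the test untouched could look like. The `containsThesePhrases` argument is an assumption about the suite's existing `assertUnsupported` helper, since the diff only shows the `sql` parameter:

   ```scala
   test("create hive external table - location must be specified") {
     // Keep the original statement so the existing behavior stays covered:
     // CREATE EXTERNAL TABLE without a LOCATION clause should still raise the
     // unsupported-operation error, even with STORED AS parquet present.
     // `containsThesePhrases` is assumed here; the diff only shows the `sql` argument.
     assertUnsupported(
       sql = "CREATE EXTERNAL TABLE my_tab STORED AS parquet",
       containsThesePhrases = Seq("create external table", "location"))
   }
   ```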
