HeartSaVioR commented on a change in pull request #27107: [SPARK-30436][SQL] Allow CREATE EXTERNAL TABLE with only requiring LOCATION
URL: https://github.com/apache/spark/pull/27107#discussion_r363281172
 
 

 ##########
 File path: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLParserSuite.scala
 ##########
 @@ -246,18 +246,17 @@ class DDLParserSuite extends AnalysisTest with SharedSparkSession {
 
   test("create hive external table - location must be specified") {
     assertUnsupported(
-      sql = "CREATE EXTERNAL TABLE my_tab STORED AS parquet",
+      sql = "CREATE EXTERNAL TABLE my_tab",
 
 Review comment:
   This clearly represents the change: `STORED AS parquet` used to be required so that the statement would fall into createHiveTable, but it is now optional. We no longer require it.
