ShreelekhyaG commented on a change in pull request #4186:
URL: https://github.com/apache/carbondata/pull/4186#discussion_r677361695
##########
File path: integration/spark/src/test/scala/org/apache/carbondata/spark/testsuite/deleteTable/TestDeleteTableNewDDL.scala
##########
@@ -58,6 +67,20 @@ class TestDeleteTableNewDDL extends QueryTest with BeforeAndAfterAll {
     }.getMessage.contains("Database 'dropdb_test' not found"))
   }
+  test("test drop database when dblocation is inconsistent") {
+    var dbName = "dropdb_test"
+    sql(s"drop database if exists $dbName cascade")
+    CarbonProperties.getInstance().addProperty(CarbonCommonConstants.STORE_LOCATION,
+      warehouse + File.separator + "carbonwarehouse")
+    sql(s"create database $dbName")
Review comment:
> I think, create database itself should not be allowed. Since the database is already created, the location is registered to Hive. So, this should be handled while creating the DB.
I agree. Better to create the database in `spark.sql.warehouse.dir` only, as `carbon.storelocation` is deprecated.
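
For illustration only, a minimal sketch of what the test could look like if the database location is left to `spark.sql.warehouse.dir` instead of overriding the deprecated `carbon.storelocation` property. The `sql` helper and the `dbName` value mirror the patch above; the test name and the rest are assumptions, not the actual change proposed in this PR:

```scala
// Sketch only: relies on the sql() helper provided by the surrounding QueryTest suite.
// No STORE_LOCATION override, so the database location comes from spark.sql.warehouse.dir.
test("drop database created under spark.sql.warehouse.dir") {
  val dbName = "dropdb_test"                      // same name as in the patch above
  sql(s"drop database if exists $dbName cascade") // clean up any earlier run
  sql(s"create database $dbName")                 // location resolved by Spark warehouse dir
  sql(s"drop database $dbName cascade")           // drop succeeds; location stays consistent
}
```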