empcl commented on code in PR #9425:
URL: https://github.com/apache/hudi/pull/9425#discussion_r1296007445


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -965,8 +965,9 @@ object HoodieSparkSqlWriter {
     // we must invalidate this table in the cache so writes are reflected in later queries
     if (metaSyncEnabled) {
       getHiveTableNames(hoodieConfig).foreach(name => {
-        val qualifiedTableName = String.join(".", hoodieConfig.getStringOrDefault(HIVE_DATABASE), name)
-        if (spark.catalog.tableExists(qualifiedTableName)) {
+        val syncDb = hoodieConfig.getStringOrDefault(HIVE_DATABASE)
+        val qualifiedTableName = String.join(".", syncDb, name)
+        if (spark.catalog.databaseExists(syncDb) && spark.catalog.tableExists(qualifiedTableName)) {

Review Comment:
   Hello, in Spark 3.1 and earlier, checking whether a table exists requires that its database already exist. Here, however, the database is not registered in the catalog in advance, which is why the check needs the `databaseExists` guard.
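
   A minimal, self-contained sketch of the failure mode the review describes. `StubCatalog` is a hypothetical stand-in (not the real Spark `Catalog` API) that mimics the Spark &lt;= 3.1 behavior where checking a table's existence fails when the database itself is absent; `safeTableExists` mirrors the guarded check in the diff, short-circuiting on `databaseExists` so a missing database yields `false` instead of an error:

   ```scala
   // Exception mimicking Spark's behavior when a database is not registered.
   final case class NoSuchDatabaseException(db: String)
     extends Exception(s"Database '$db' not found")

   // Hypothetical stub catalog: tableExists fails when the database is absent,
   // as the reviewer reports for Spark 3.1 and earlier.
   class StubCatalog(tables: Map[String, Set[String]]) {
     def databaseExists(db: String): Boolean = tables.contains(db)
     def tableExists(qualified: String): Boolean = {
       val Array(db, table) = qualified.split("\\.", 2)
       if (!databaseExists(db)) throw NoSuchDatabaseException(db)
       tables(db).contains(table)
     }
   }

   // The guarded check from the diff: databaseExists short-circuits, so a
   // missing database returns false rather than throwing.
   def safeTableExists(catalog: StubCatalog, db: String, table: String): Boolean =
     catalog.databaseExists(db) && catalog.tableExists(s"$db.$table")
   ```

   With this guard, `safeTableExists(catalog, "missing_db", "t1")` returns `false`, whereas calling `tableExists("missing_db.t1")` directly would fail.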


