empcl commented on code in PR #9425:
URL: https://github.com/apache/hudi/pull/9425#discussion_r1305905377


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -965,8 +965,9 @@ object HoodieSparkSqlWriter {
     // we must invalidate this table in the cache so writes are reflected in later queries
     if (metaSyncEnabled) {
       getHiveTableNames(hoodieConfig).foreach(name => {
-        val qualifiedTableName = String.join(".", hoodieConfig.getStringOrDefault(HIVE_DATABASE), name)
-        if (spark.catalog.tableExists(qualifiedTableName)) {
+        val syncDb = hoodieConfig.getStringOrDefault(HIVE_DATABASE)
+        val qualifiedTableName = String.join(".", syncDb, name)
+        if (spark.catalog.databaseExists(syncDb) && spark.catalog.tableExists(qualifiedTableName)) {

Review Comment:
   @danny0405 @KnightChess Hello, the invalidate-table operation should only be executed when enableHiveSupport() is enabled, but sometimes we do not need Hive support at all, for example in tests.
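
   To illustrate the concern, here is a hedged sketch (not the PR's actual code): when a session is built without enableHiveSupport(), Spark uses the in-memory catalog, so the Hive sync database typically does not exist there. One way to express the guard the comment asks for is to additionally key the invalidation on the `spark.sql.catalogImplementation` setting, which is `"hive"` only for sessions built with enableHiveSupport(). The `maybeInvalidate` helper and the use of `refreshTable` as the invalidation step are assumptions for illustration; `syncDb` and `qualifiedTableName` mirror the diff.

   ```scala
   import org.apache.spark.sql.SparkSession

   object InvalidateSketch {
     // Hypothetical helper, not HoodieSparkSqlWriter's code: only attempt
     // cache invalidation when the session actually has a Hive catalog.
     def maybeInvalidate(spark: SparkSession, syncDb: String, name: String): Unit = {
       val qualifiedTableName = String.join(".", syncDb, name)
       // "spark.sql.catalogImplementation" is "hive" iff the session was
       // built with enableHiveSupport(); it defaults to "in-memory".
       val hiveCatalogEnabled =
         spark.conf.get("spark.sql.catalogImplementation", "in-memory") == "hive"
       if (hiveCatalogEnabled &&
           spark.catalog.databaseExists(syncDb) &&
           spark.catalog.tableExists(qualifiedTableName)) {
         // Assumption: "invalidate" here means refreshing cached metadata
         // so later queries see the new writes.
         spark.catalog.refreshTable(qualifiedTableName)
       }
     }
   }
   ```

   With such a guard, test sessions built without enableHiveSupport() would skip the catalog lookups entirely instead of probing a database that cannot exist in the in-memory catalog.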



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
