empcl commented on code in PR #9425:
URL: https://github.com/apache/hudi/pull/9425#discussion_r1295516189


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -965,8 +965,9 @@ object HoodieSparkSqlWriter {
     // we must invalidate this table in the cache so writes are reflected in later queries
     if (metaSyncEnabled) {
       getHiveTableNames(hoodieConfig).foreach(name => {
-        val qualifiedTableName = String.join(".", hoodieConfig.getStringOrDefault(HIVE_DATABASE), name)
-        if (spark.catalog.tableExists(qualifiedTableName)) {
+        val syncDb = hoodieConfig.getStringOrDefault(HIVE_DATABASE)
+        val qualifiedTableName = String.join(".", syncDb, name)

Review Comment:
   Hello, let me first describe the background. When I use the Spark sync-to-Hive function on Spark 3.1 or below, validating the synced Hive table fails when the target database does not exist. After reviewing the source code, I found that on Spark 3.1 `tableExists` first verifies that the database exists:
   
   ```scala
   protected def requireDbExists(db: String): Unit = {
     if (!databaseExists(db)) {
       throw new NoSuchDatabaseException(db)
     }
   }
   ```
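   The safe pattern the diff implies can be sketched as a guard that checks the database before the table. The snippet below is a minimal, self-contained illustration: the `Catalog` type here is a toy `Map`, not the real Spark `Catalog` API, and `tableExists` deliberately mimics the Spark 3.1 behaviour of failing on a missing database.

   ```scala
   object SafeTableLookup {
     // Toy stand-in for a catalog: database name -> set of table names.
     type Catalog = Map[String, Set[String]]

     def databaseExists(catalog: Catalog, db: String): Boolean =
       catalog.contains(db)

     // Mimics Spark 3.1: looking up a table in a missing database throws,
     // just as requireDbExists raises NoSuchDatabaseException.
     def tableExists(catalog: Catalog, db: String, table: String): Boolean = {
       require(databaseExists(catalog, db), s"NoSuchDatabaseException: $db")
       catalog(db).contains(table)
     }

     // Guarded variant: only consult tableExists once the database is known
     // to exist, so a missing sync database yields false instead of an error.
     def tableExistsSafely(catalog: Catalog, db: String, table: String): Boolean =
       databaseExists(catalog, db) && tableExists(catalog, db, table)
   }
   ```

   With the real Spark API the same guard would read `spark.catalog.databaseExists(syncDb) && spark.catalog.tableExists(qualifiedTableName)`.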


