zyl891229 opened a new issue, #10170:
URL: https://github.com/apache/hudi/issues/10170

   **Describe the problem you faced**
   
   An error is reported during HMS sync when performing an insert-overwrite write.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Write with the Spark DataFrame API, with HMS sync enabled:
   
       df.write.format("hudi")
         .option(DataSourceWriteOptions.TABLE_TYPE.key(), COW_TABLE_TYPE_OPT_VAL)
         .options(getQuickstartWriteConfigs)
         .option(PRECOMBINE_FIELD_OPT_KEY, "name")
         .option(RECORDKEY_FIELD_OPT_KEY, "name")
         .option(PARTITIONPATH_FIELD_OPT_KEY, "city")
         .option(HoodieWriteConfig.TBL_NAME.key(), tableName)
         .option(DataSourceWriteOptions.OPERATION.key(), DataSourceWriteOptions.BULK_INSERT_OPERATION_OPT_VAL)
         .option("hoodie.bulkinsert.overwrite.operation.type", "insert_overwrite")
         .option(DataSourceWriteOptions.HIVE_STYLE_PARTITIONING.key(), "true")
         .option(HoodieTableConfig.POPULATE_META_FIELDS.key(), "false")
         .option(HoodieMetadataConfig.ENABLE.key(), "true")
         .option("hoodie.metadata.index.column.stats.enable", "true")
         .option("hoodie.datasource.hive_sync.enable", "true")
         .option("hoodie.datasource.hive_sync.mode", "hms")
         .option("hoodie.datasource.hive_sync.metastore.uris", "thrift://127.0.0.1:9083")
         .mode(SaveMode.Append)
         .save(basePath)
   2. On the second overwrite write, the error below is reported during the sync logic.
   According to the HMS service log, the database name is an empty string,
   but according to the source code the default value should be "default".
   
   <img width="922" alt="image" src="https://github.com/apache/hudi/assets/13060417/e699403b-e3a7-4909-bf0c-f34032a7ff7f">
   
   <img width="899" alt="image" src="https://github.com/apache/hudi/assets/13060417/ef622afb-baf8-4c0a-a203-624e0629f1f7">
   
    
   
   **Expected behavior**
   
   The default value of hoodie.datasource.hive_sync.database should be "default", not an empty string.
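
   Until the default is resolved correctly, a possible workaround (a sketch, not verified against 0.14.0) is to set the sync database explicitly so the sync path never has to fall back to the default:

   ```scala
   // Workaround sketch: pin the target database for hive sync explicitly,
   // instead of relying on the default of hoodie.datasource.hive_sync.database.
   // All other options are the same as in the reproduction above.
   df.write.format("hudi")
     .option("hoodie.datasource.hive_sync.enable", "true")
     .option("hoodie.datasource.hive_sync.mode", "hms")
     .option("hoodie.datasource.hive_sync.database", "default") // explicit, not defaulted
     .option("hoodie.datasource.hive_sync.metastore.uris", "thrift://127.0.0.1:9083")
     .mode(SaveMode.Append)
     .save(basePath)
   ```

   If the create_database call in the HMS audit log then shows `Database(name:default, ...)` instead of an empty name, that would confirm the bug is in how the default value is resolved on the overwrite path.
   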
   
   **Environment Description**
   
   * Hudi version : 0.14.0
   
   * Spark version : 3.1.1
   
   * Storage (HDFS/S3/GCS..) : OSS
   
   * Running on Docker? (yes/no) : no
   
   **Additional context**
   
   
   **Stacktrace**
   
   Spark log:
   2023-11-21 21:25:37,195 ERROR org.apache.spark.deploy.yarn.ApplicationMaster 
              - User class threw exception: 
org.apache.hudi.exception.HoodieMetaSyncException: Could not sync using the 
meta sync class org.apache.hudi.hive.HiveSyncTool
   org.apache.hudi.exception.HoodieMetaSyncException: Could not sync using the 
meta sync class org.apache.hudi.hive.HiveSyncTool
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:81)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:993)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79) 
~[scala-library-2.12.10.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:991) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:1089)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.bulkInsertAsRow(HoodieSparkSqlWriter.scala:920)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.writeInternal(HoodieSparkSqlWriter.scala:409)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:132) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:150) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) 
~[spark-core_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
 ~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293) 
~[spark-sql_2.12-3.1.1.jar:3.1.1]
        at 
com.aliyun.odps.spark.examples.hudi.SparkHudiJson$.main(SparkHudiJson.scala:253)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
com.aliyun.odps.spark.examples.hudi.SparkHudiJson.main(SparkHudiJson.scala) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_65-AliJVM]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_65-AliJVM]
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
 [spark-yarn_2.12-3.1.1.jar:3.1.1]
   Caused by: org.apache.hudi.exception.HoodieException: Got runtime exception 
when hive syncing 
hudi_dwd_aplus_pub_log_event_1d_tenant1_project47532305069060_v50_extend_odps_2201_12000_bulkoverwrite
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:168) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:79)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        ... 36 more
   Caused by: org.apache.hudi.hive.HoodieHiveSyncException: failed to create 
table 
hudi_dwd_aplus_pub_log_event_1d_tenant1_project47532305069060_v50_extend_odps_2201_12000_bulkoverwrite
        at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.createTable(HMSDDLExecutor.java:140) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HoodieHiveSyncClient.createTable(HoodieHiveSyncClient.java:235)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncFirstTime(HiveSyncTool.java:329) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:251) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:177) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:165) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:79)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        ... 36 more
   Caused by: org.apache.hadoop.hive.metastore.api.InvalidObjectException: 
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42216)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42193)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:42119)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:88) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1203)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1189)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2405)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:93)
 ~[hive-exec-2.3.7-core.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:752)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:740)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_65-AliJVM]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_65-AliJVM]
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at com.sun.proxy.$Proxy73.createTable(Unknown Source) ~[?:?]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
~[?:1.8.0_65-AliJVM]
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 ~[?:1.8.0_65-AliJVM]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[?:1.8.0_65-AliJVM]
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2336)
 ~[hive-metastore-2.3.7.jar:2.3.7]
        at com.sun.proxy.$Proxy73.createTable(Unknown Source) ~[?:?]
        at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.createTable(HMSDDLExecutor.java:137) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HoodieHiveSyncClient.createTable(HoodieHiveSyncClient.java:235)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncFirstTime(HiveSyncTool.java:329) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:251) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:177) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:165) 
~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:79)
 ~[_f1c60536b59a86c4fb55c4ab4a94cd56.jar:?]
        ... 36 more
   2023-11-21 21:25:37,203 INFO  org.apache.spark.deploy.yarn.ApplicationMaster 
              - Final app status: FAILED, exitCode: 15, (reason: User class 
threw exception: org.apache.hudi.exception.HoodieMetaSyncException: Could not 
sync using the meta sync class org.apache.hudi.hive.HiveSyncTool
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:81)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:993)
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:991)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:1089)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.bulkInsertAsRow(HoodieSparkSqlWriter.scala:920)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.writeInternal(HoodieSparkSqlWriter.scala:409)
        at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:132)
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:150)
        at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
        at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
        at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:180)
        at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:218)
        at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:215)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:176)
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:132)
        at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:131)
        at 
org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
        at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
        at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
        at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
        at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
        at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
        at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
        at 
com.aliyun.odps.spark.examples.hudi.SparkHudiJson$.main(SparkHudiJson.scala:253)
        at 
com.aliyun.odps.spark.examples.hudi.SparkHudiJson.main(SparkHudiJson.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
   Caused by: org.apache.hudi.exception.HoodieException: Got runtime exception 
when hive syncing 
hudi_dwd_aplus_pub_log_event_1d_tenant1_project47532305069060_v50_extend_odps_2201_12000_bulkoverwrite
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:168)
        at 
org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:79)
        ... 36 more
   Caused by: org.apache.hudi.hive.HoodieHiveSyncException: failed to create 
table 
hudi_dwd_aplus_pub_log_event_1d_tenant1_project47532305069060_v50_extend_odps_2201_12000_bulkoverwrite
        at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.createTable(HMSDDLExecutor.java:140)
        at 
org.apache.hudi.hive.HoodieHiveSyncClient.createTable(HoodieHiveSyncClient.java:235)
        at 
org.apache.hudi.hive.HiveSyncTool.syncFirstTime(HiveSyncTool.java:329)
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:251)
        at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:177)
        at 
org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:165)
        ... 37 more
   Caused by: InvalidObjectException(message:)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42216)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:42193)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:42119)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:88)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1203)
        at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1189)
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2405)
        at 
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.create_table_with_environment_context(SessionHiveMetaStoreClient.java:93)
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:752)
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:740)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at 
org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
        at com.sun.proxy.$Proxy73.createTable(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at 
org.apache.hadoop.hive.metastore.HiveMetaStoreClient$SynchronizedHandler.invoke(HiveMetaStoreClient.java:2336)
        at com.sun.proxy.$Proxy73.createTable(Unknown Source)
        at 
org.apache.hudi.hive.ddl.HMSDDLExecutor.createTable(HMSDDLExecutor.java:137)
        ... 42 more
   
   Hive metastore log:
   2023-11-21T13:25:33,821  INFO [pool-7-thread-193] HiveMetaStore.audit: 
ugi=aaa        ip=100.121.110.144      cmd=source:100.121.110.144 
create_database: Database(name:, description:automatically created by hoodie, 
locationUri:null, parameters:null)
   2023-11-21T13:25:33,827  WARN [pool-7-thread-193] metastore.ObjectStore: 
Failed to get database hive., returning NoSuchObjectException
   2023-11-21T13:25:33,827 ERROR [pool-7-thread-193] 
metastore.RetryingHMSHandler: InvalidObjectException(message: is not a valid 
database name)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database_core(HiveMetaStore.java:1185)
           at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_database(HiveMetaStore.java:1267)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:147)
           at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:108)
           at com.sun.proxy.$Proxy32.create_database(Unknown Source)
           at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:14298)
           at 
org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Processor$create_database.getResult(ThriftHiveMetastore.java:14282)
           at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
           at 
org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:111)
           at 
org.apache.hadoop.hive.metastore.TUGIBasedProcessor$1.run(TUGIBasedProcessor.java:107)
           at java.security.AccessController.doPrivileged(Native Method)
           at javax.security.auth.Subject.doAs(Subject.java:422)
           at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
           at 
org.apache.hadoop.hive.metastore.TUGIBasedProcessor.process(TUGIBasedProcessor.java:119)
           at 
org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
           at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:750)
   

