Armelabdelkbir opened a new issue, #7557:
URL: https://github.com/apache/hudi/issues/7557

   
   
   Hello, community.
   A few months after migrating to Hudi 0.11.0 and fixing some earlier issues, my CDC jobs ran fine for weeks, but recently they all started failing during Hive sync. The same configuration and workflow had worked for months.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Enable Hive sync.
   2. Start a Spark Hudi job.
   
   
   **Expected behavior**
   
   The `_ro` and `_rt` tables are created in Hive.
   
   **Environment Description**
   
    * Hudi version : 0.11.0
    * Spark version : 3.1.4
    * Hive version : 1.2.1000
    * Storage (HDFS) : 2.7.3
   
   
   
   **Hudi Hive sync conf**

        DataSourceWriteOptions.HIVE_SYNC_ENABLED.key -> "true",
        "hoodie.datasource.hive_sync.use_jdbc" -> "false",
        DataSourceWriteOptions.HIVE_SYNC_MODE.key -> "hms",
        DataSourceWriteOptions.HIVE_DATABASE.key -> table.db_name,
        DataSourceWriteOptions.HIVE_URL.key -> "thrift://mnode1.datalake:9083",
        DataSourceWriteOptions.HIVE_PARTITION_EXTRACTOR_CLASS.key -> config.hudiConf.key_generator_hive,
        DataSourceWriteOptions.HIVE_TABLE.key -> table.table_name,
        "hoodie.datasource.hive_sync.ignore_exceptions" -> "true",
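   For context, here is a minimal sketch of how options like these are typically wired into a Hudi write. This is a config fragment only, not the actual failing job: the table name `events`, database `cdc_db`, record key `id`, precombine field `ts`, and base path are placeholder assumptions. The Hive sync keys are the same ones used above.

   ```scala
   import org.apache.hudi.DataSourceWriteOptions
   import org.apache.spark.sql.SaveMode

   // Sketch only: names, fields, and paths below are placeholders,
   // not values from the failing job.
   val hudiOptions = Map(
     "hoodie.table.name" -> "events",
     DataSourceWriteOptions.TABLE_TYPE.key -> "MERGE_ON_READ", // MOR, hence _ro/_rt tables
     DataSourceWriteOptions.RECORDKEY_FIELD.key -> "id",
     DataSourceWriteOptions.PRECOMBINE_FIELD.key -> "ts",
     // Hive sync settings matching the conf above
     DataSourceWriteOptions.HIVE_SYNC_ENABLED.key -> "true",
     DataSourceWriteOptions.HIVE_SYNC_MODE.key -> "hms",
     "hoodie.datasource.hive_sync.use_jdbc" -> "false",
     DataSourceWriteOptions.HIVE_DATABASE.key -> "cdc_db",
     DataSourceWriteOptions.HIVE_TABLE.key -> "events",
     DataSourceWriteOptions.HIVE_URL.key -> "thrift://mnode1.datalake:9083"
   )

   df.write.format("hudi")
     .options(hudiOptions)
     .mode(SaveMode.Append)
     .save("/data/hudi/events")
   ```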
   
   
   **Stacktrace**
   
   ```
           at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:356)
           at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244)
   Caused by: org.apache.hudi.exception.HoodieException: Could not sync using the meta sync class org.apache.hudi.hive.HiveSyncTool
           at org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:61)
           at org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:622)
           at org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2$adapted(HoodieSparkSqlWriter.scala:621)
           at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
           at org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:621)
           at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:680)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:313)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:163)
           at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:317)
           ... 1 more
   Caused by: org.apache.hudi.exception.HoodieException: Unable to instantiate class org.apache.hudi.hive.HiveSyncTool
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:91)
           at org.apache.hudi.sync.common.util.SyncUtilHelpers.instantiateMetaSyncTool(SyncUtilHelpers.java:78)
           at org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:59)
           ... 56 more
   Caused by: java.lang.reflect.InvocationTargetException
           at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
           at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
           at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
           at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
           ... 58 more
   ```
   
   

