ankit0811 commented on issue #11036:
URL: https://github.com/apache/hudi/issues/11036#issuecomment-2237628185

   @ad1happy2go that seems to have done the trick. Thanks
   
   However, we did see another exception after the above change:
   
   ```
    at org.apache.flink.runtime.operators.coordination.OperatorCoordinatorHolder$LazyInitializedCoordinatorContext.failJob(OperatorCoordinatorHolder.java:556)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.lambda$start$0(StreamWriteOperatorCoordinator.java:199)
    at org.apache.hudi.sink.utils.NonThrownExecutor.handleException(NonThrownExecutor.java:142)
    at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$wrapAction$0(NonThrownExecutor.java:133)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.apache.hudi.exception.HoodieException: Executor executes action [handle write metadata event for instant ] error
    ... 6 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Error upserting bucketType UPDATE for partition :files
    at org.apache.hudi.table.action.commit.BaseFlinkCommitActionExecutor.handleUpsertPartition(BaseFlinkCommitActionExecutor.java:167)
    at org.apache.hudi.table.action.commit.BaseFlinkCommitActionExecutor.execute(BaseFlinkCommitActionExecutor.java:101)
    at org.apache.hudi.table.action.commit.delta.FlinkUpsertPreppedDeltaCommitActionExecutor.execute(FlinkUpsertPreppedDeltaCommitActionExecutor.java:52)
    at org.apache.hudi.table.HoodieFlinkMergeOnReadTable.upsertPrepped(HoodieFlinkMergeOnReadTable.java:85)
    at org.apache.hudi.client.HoodieFlinkWriteClient.lambda$upsertPreppedRecords$4(HoodieFlinkWriteClient.java:168)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
    at java.base/java.util.HashMap$ValueSpliterator.forEachRemaining(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
    at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(Unknown Source)
    at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(Unknown Source)
    at java.base/java.util.stream.AbstractTask.compute(Unknown Source)
    at java.base/java.util.concurrent.CountedCompleter.exec(Unknown Source)
    at java.base/java.util.concurrent.ForkJoinTask.doExec(Unknown Source)
    at java.base/java.util.concurrent.ForkJoinTask.doInvoke(Unknown Source)
    at java.base/java.util.concurrent.ForkJoinTask.invoke(Unknown Source)
    at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateParallel(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline.collect(Unknown Source)
    at org.apache.hudi.client.HoodieFlinkWriteClient.upsertPreppedRecords(HoodieFlinkWriteClient.java:171)
    at org.apache.hudi.client.HoodieFlinkWriteClient.upsertPreppedRecords(HoodieFlinkWriteClient.java:75)
    at org.apache.hudi.metadata.FlinkHoodieBackedTableMetadataWriter.commitInternal(FlinkHoodieBackedTableMetadataWriter.java:162)
    at org.apache.hudi.metadata.FlinkHoodieBackedTableMetadataWriter.commit(FlinkHoodieBackedTableMetadataWriter.java:103)
    at org.apache.hudi.metadata.HoodieBackedTableMetadataWriter.processAndCommit(HoodieBackedTableMetadataWriter.java:962)
    at org.apache.hudi.metadata.HoodieBackedTableMetadataWriter.updateFromWriteStatuses(HoodieBackedTableMetadataWriter.java:1019)
    at org.apache.hudi.client.BaseHoodieClient.writeTableMetadata(BaseHoodieClient.java:289)
    at org.apache.hudi.client.BaseHoodieWriteClient.commit(BaseHoodieWriteClient.java:287)
    at org.apache.hudi.client.BaseHoodieWriteClient.commitStats(BaseHoodieWriteClient.java:237)
    at org.apache.hudi.client.HoodieFlinkWriteClient.commit(HoodieFlinkWriteClient.java:112)
    at org.apache.hudi.client.HoodieFlinkWriteClient.commit(HoodieFlinkWriteClient.java:75)
    at org.apache.hudi.client.BaseHoodieWriteClient.commit(BaseHoodieWriteClient.java:202)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.doCommit(StreamWriteOperatorCoordinator.java:581)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.commitInstant(StreamWriteOperatorCoordinator.java:557)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.commitInstant(StreamWriteOperatorCoordinator.java:526)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.initInstant(StreamWriteOperatorCoordinator.java:421)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.handleBootstrapEvent(StreamWriteOperatorCoordinator.java:453)
    at org.apache.hudi.sink.StreamWriteOperatorCoordinator.lambda$handleEventFromOperator$4(StreamWriteOperatorCoordinator.java:294)
    at org.apache.hudi.sink.utils.NonThrownExecutor.lambda$wrapAction$0(NonThrownExecutor.java:130)
    ... 3 more
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hudi.org.apache.hadoop.hbase.io.hfile.HFile
    at org.apache.hudi.common.util.HFileUtils.serializeRecordsToLogBlock(HFileUtils.java:210)
    at org.apache.hudi.common.table.log.block.HoodieHFileDataBlock.serializeRecords(HoodieHFileDataBlock.java:108)
    at org.apache.hudi.common.table.log.block.HoodieDataBlock.getContentBytes(HoodieDataBlock.java:148)
    at org.apache.hudi.common.table.log.HoodieLogFormatWriter.appendBlocks(HoodieLogFormatWriter.java:151)
    at org.apache.hudi.io.HoodieAppendHandle.appendDataAndDeleteBlocks(HoodieAppendHandle.java:474)
    at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:445)
    at org.apache.hudi.table.action.commit.delta.BaseFlinkDeltaCommitActionExecutor.handleUpdate(BaseFlinkDeltaCommitActionExecutor.java:54)
    at org.apache.hudi.table.action.commit.BaseFlinkCommitActionExecutor.handleUpsertPartition(BaseFlinkCommitActionExecutor.java:159)
    ... 40 more
   ```
        
   The HFile class seems to be present in the Hudi Flink bundle jar, so we are not sure what is causing this. Any pointers?
   As a workaround, we ended up adding the hbase-server library to the classpath on the server.
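
   For anyone who hits the same `NoClassDefFoundError`, the snippet below is roughly what we did. It is a sketch of our particular setup, not a general recommendation; the hbase-server jar version and the `FLINK_HOME` path are placeholders.

   ```shell
   # Make the HBase classes available to the Flink cluster by placing
   # hbase-server into Flink's lib/ directory (jar version and FLINK_HOME
   # are placeholders for our environment).
   cp hbase-server-<version>.jar "$FLINK_HOME/lib/"
   # Restart the cluster so the new jar is picked up by the classloader.
   "$FLINK_HOME/bin/stop-cluster.sh"
   "$FLINK_HOME/bin/start-cluster.sh"
   ```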
   
   
   Another issue that we keep seeing is:
   ```
org.apache.hudi.exception.HoodieException: Timeout(21000ms) while waiting for instant initialize
    at org.apache.hudi.sink.utils.TimeWait.waitFor(TimeWait.java:57)
    at org.apache.hudi.sink.common.AbstractStreamWriteFunction.instantToWrite(AbstractStreamWriteFunction.java:270)
    at org.apache.hudi.sink.append.AppendWriteFunction.initWriterHelper(AppendWriteFunction.java:138)
    at org.apache.hudi.sink.append.AppendWriteFunction.processElement(AppendWriteFunction.java:97)
    at org.apache.flink.streaming.api.operators.ProcessOperator.processElement(ProcessOperator.java:66)
    at org.apache.flink.streaming.runtime.tasks.OneInputStreamTask$StreamTaskNetworkOutput.emitRecord(OneInputStreamTask.java:233)
    at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.processElement(AbstractStreamTaskNetworkInput.java:134)
    at org.apache.flink.streaming.runtime.io.AbstractStreamTaskNetworkInput.emitNext(AbstractStreamTaskNetworkInput.java:105)
    at org.apache.flink.streaming.runtime.io.StreamOneInputProcessor.processInput(StreamOneInputProcessor.java:65)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.processInput(StreamTask.java:519)
    at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:203)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:804)
    at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:753)
    at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:948)
    at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:741)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563)
    at java.base/java.lang.Thread.run(Unknown Source)
   
   ```
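
   For context on the timeout: our (possibly wrong) reading of `AbstractStreamWriteFunction` is that this wait is bounded by the `write.commit.ack.timeout` table option, so one mitigation we are considering is raising it. A hypothetical Flink SQL fragment follows; the option name is our assumption from hudi-flink's `FlinkOptions`, and the table name, path, and value are purely illustrative:

   ```sql
   -- Illustrative only: raise the write task's instant/ack wait to 60s.
   -- 'write.commit.ack.timeout' is our assumption for the governing option.
   CREATE TABLE hudi_sink_example (
     id BIGINT,
     ts TIMESTAMP(3)
   ) WITH (
     'connector' = 'hudi',
     'path' = 's3://example-bucket/example-table',  -- placeholder path
     'write.commit.ack.timeout' = '60000'           -- milliseconds
   );
   ```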
   
   Any pointers, or a config that you think we might have set incorrectly?
   
   Thanks
   Ankit

