abhiNB-star opened a new issue, #13410:
URL: https://github.com/apache/hudi/issues/13410

   25/06/09 07:35:44 INFO BlockManagerInfo: Added broadcast_39_piece0 in memory on 10.51.0.162:34665 (size: 57.5 KiB, free: 992.6 MiB)
   25/06/09 07:35:44 INFO BlockManagerInfo: Added broadcast_39_piece0 in memory on 10.51.0.194:35583 (size: 57.5 KiB, free: 998.1 MiB)
   25/06/09 07:35:45 INFO TaskSetManager: Starting task 3.0 in stage 72.0 (TID 543) (10.51.0.194, executor 2, partition 3, PROCESS_LOCAL, 10100 bytes)
   25/06/09 07:35:45 WARN TaskSetManager: Lost task 1.0 in stage 72.0 (TID 541) (10.51.0.194 executor 2): org.apache.hudi.exception.HoodieException: org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinTask.getThrowableException(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinTask.reportException(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinTask.invoke(Unknown Source)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateParallel(Unknown Source)
        at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
        at java.base/java.util.stream.ReferencePipeline.collect(Unknown Source)
        at org.apache.hudi.common.data.HoodieBaseListData.<init>(HoodieBaseListData.java:46)
        at org.apache.hudi.common.data.HoodieListData.<init>(HoodieListData.java:69)
        at org.apache.hudi.common.data.HoodieListData.flatMap(HoodieListData.java:136)
        at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeyPrefixes(HoodieBackedTableMetadata.java:214)
        at org.apache.hudi.metadata.HoodieTableMetadataUtil.lambda$convertMetadataToPartitionStatRecords$4665d616$1(HoodieTableMetadataUtil.java:2700)
        at org.apache.hudi.data.HoodieJavaRDD.lambda$mapToPair$aa72055d$1(HoodieJavaRDD.java:173)
        at org.apache.spark.api.java.JavaPairRDD$.$anonfun$pairFunToScalaFun$1(JavaPairRDD.scala:1073)
        at scala.collection.Iterator$$anon$10.next(Iterator.scala:461)
        at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486)
        at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492)
        at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:140)
        at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:59)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
        at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
        at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
        at org.apache.spark.scheduler.Task.run(Task.scala:141)
        at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
        at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.base/java.lang.Thread.run(Unknown Source)
   Caused by: org.apache.hudi.exception.HoodieException: Error occurs when executing map
        at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:40)
        at org.apache.hudi.common.data.HoodieListData.lambda$flatMap$0(HoodieListData.java:135)
        at java.base/java.util.stream.ReferencePipeline$7$1.accept(Unknown Source)
        at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(Unknown Source)
        at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
        at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(Unknown Source)
        at java.base/java.util.stream.ReduceOps$ReduceTask.doLeaf(Unknown Source)
        at java.base/java.util.stream.AbstractTask.compute(Unknown Source)
        at java.base/java.util.concurrent.CountedCompleter.exec(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinTask.doExec(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinPool.scan(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinPool.runWorker(Unknown Source)
        at java.base/java.util.concurrent.ForkJoinWorkerThread.run(Unknown Source)
   Caused by: org.apache.hudi.exception.HoodieException: Exception when reading log file
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV1(AbstractHoodieLogRecordScanner.java:390)
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternal(AbstractHoodieLogRecordScanner.java:252)
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.scanByKeyPrefixes(HoodieMergedLogRecordScanner.java:199)
        at org.apache.hudi.metadata.HoodieMetadataLogRecordReader.getRecordsByKeyPrefixes(HoodieMetadataLogRecordReader.java:87)
        at org.apache.hudi.metadata.HoodieBackedTableMetadata.readLogRecords(HoodieBackedTableMetadata.java:344)
        at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeyPrefixes$7539c171$1(HoodieBackedTableMetadata.java:234)
        at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapWrapper$0(FunctionWrapper.java:38)
        ... 14 more
   Caused by: java.lang.ClassCastException: class org.apache.avro.generic.GenericData$Record cannot be cast to class org.apache.hudi.avro.model.HoodieDeleteRecordList (org.apache.avro.generic.GenericData$Record is in unnamed module of loader 'app'; org.apache.hudi.avro.model.HoodieDeleteRecordList is in unnamed module of loader org.apache.spark.util.MutableURLClassLoader @7066107d)
        at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.deserialize(HoodieDeleteBlock.java:168)
        at org.apache.hudi.common.table.log.block.HoodieDeleteBlock.getRecordsToDelete(HoodieDeleteBlock.java:123)
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:680)
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scanInternalV1(AbstractHoodieLogRecordScanner.java:380)
        ... 20 more

   25/06/09 07:35:46 INFO TaskSetManager: Starting task 1.1 in stage 72.0 (TID 544) (10.51.0.162, executor 3, partition 1, PROCESS_LOCAL, 10100 bytes)
   25/06/09 07:35:46 INFO TaskSetManager: Lost task 2.0 in stage 72.0 (TID 542) on 10.51.0.162, executor 3: org.apache.hudi.exception.HoodieException (org.apache.hudi.exception.HoodieException: Error occurs when executing map) [duplicate 1]
   25/06/09 07:35:46 INFO TaskSetManager: Starting task 2.1 in stage 72.0 (TID 545) (10.51.0.2, executor 1, partition 2, PROCESS_LOCAL, 10100 bytes)
   25/06/09 07:35:46 WARN TaskSetManager: Lost task 0.0 in stage 72.0 (TID 540) (10.51.0.2 executor 1): org.apache.hudi.exception.HoodieException: Error occurs when executing map
   


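The root `Caused by` above is a cross-loader cast failure: the Avro decoder in `HoodieDeleteBlock.deserialize` produced a `GenericData$Record` instead of the expected `HoodieDeleteRecordList` (the message notes the two classes live under different loaders, `'app'` vs Spark's `MutableURLClassLoader`, which typically means Hudi/Avro classes are present on both the base classpath and the bundle jar). Avro's specific readers fall back to a generic representation when the generated class cannot be resolved through the loader they consult, and a blind cast then throws. The sketch below reproduces that failure mode with plain JDK types only; `DeleteRecordList`, `read`, and the isolated loader are all hypothetical stand-ins, not Hudi or Avro code:

```java
import java.net.URL;
import java.net.URLClassLoader;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GenericFallbackDemo {
    // Hypothetical stand-in for a generated "specific" record class
    // (playing the role of org.apache.hudi.avro.model.HoodieDeleteRecordList).
    public static class DeleteRecordList {
        public List<String> deleteRecordList = List.of();
    }

    /**
     * Mimics a specific-record reader: if the named class resolves through the
     * given loader, return an instance of it; otherwise silently fall back to
     * a generic container (as Avro falls back to GenericData$Record).
     */
    static Object read(String className, ClassLoader loader) {
        try {
            Class<?> specific = Class.forName(className, false, loader);
            return specific.getDeclaredConstructor().newInstance();
        } catch (ReflectiveOperationException notResolvable) {
            Map<String, Object> generic = new LinkedHashMap<>(); // generic fallback
            generic.put("deleteRecordList", List.of());
            return generic;
        }
    }

    public static void main(String[] args) {
        // A loader that cannot see the application classes (empty classpath,
        // bootstrap parent) - analogous to a plugin/bundle loader mismatch.
        ClassLoader isolated = new URLClassLoader(new URL[0], null);

        Object decoded = read(DeleteRecordList.class.getName(), isolated);
        try {
            // Blind cast to the expected specific type, as the deserializer does.
            DeleteRecordList records = (DeleteRecordList) decoded;
            System.out.println("decoded as specific record: " + records);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
```

Running this prints a `ClassCastException` of the same shape as the one in the trace (generic container on the left, expected specific class on the right). If the specific class *is* visible to the loader in use, the cast succeeds, which is why this kind of failure usually points at duplicate/conflicting Hudi or Avro jars across the Spark base classpath and the Hudi bundle rather than at the data itself.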