ssomuah opened a new issue #1836:
URL: https://github.com/apache/hudi/issues/1836


   **Describe the problem you faced**
   
   It appears that I'm unable to instantiate my custom `hoodie.datasource.write.payload.class`; I get an exception saying:
   
   ```
   Caused by: java.lang.NoSuchMethodException: com.myCustompayloadClass.<init>(org.apache.hudi.common.util.Option)
   ```
   
   The constructor for my class is:
   ```
   class myCustompayloadClass(rec: GenericRecord, oVal: Comparable[_])
     extends BaseAvroPayload(rec, oVal) with HoodieRecordPayload[myCustompayloadClass] {
   ```
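   
   Based on the `NoSuchMethodException` in the stack trace below, it looks like Hudi instantiates the payload reflectively and also expects a constructor that takes a single `org.apache.hudi.common.util.Option[GenericRecord]`. A minimal sketch of what I understand the expected shape to be (the class name is our placeholder, and the method bodies are only illustrative, not our real merge logic):
   
   ```
   import org.apache.avro.Schema
   import org.apache.avro.generic.{GenericRecord, IndexedRecord}
   import org.apache.hudi.common.model.{BaseAvroPayload, HoodieRecordPayload}
   import org.apache.hudi.common.util.{Option => HOption}
   
   class myCustompayloadClass(rec: GenericRecord, oVal: Comparable[_])
     extends BaseAvroPayload(rec, oVal) with HoodieRecordPayload[myCustompayloadClass] {
   
     // The constructor the NoSuchMethodException complains about: a single
     // org.apache.hudi.common.util.Option[GenericRecord] argument.
     def this(record: HOption[GenericRecord]) =
       this(if (record.isPresent) record.get else null, Integer.valueOf(0))
   
     // Illustrative implementations only -- real merge logic would go here.
     override def preCombine(another: myCustompayloadClass): myCustompayloadClass = another
   
     override def combineAndGetUpdateValue(currentValue: IndexedRecord,
                                           schema: Schema): HOption[IndexedRecord] =
       getInsertValue(schema)
   
     override def getInsertValue(schema: Schema): HOption[IndexedRecord] =
       if (rec == null) HOption.empty() else HOption.of(rec)
   }
   ```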
   
   
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Spark-submit an application that uses a MOR table and a custom payload class
   
   
   **Expected behavior**
   
   The application should compact the MOR table.
   
   **Environment Description**
   
   * Hudi version : Master @ 3b9a30528bd6a6369181702303f3384162b04a7f (https://github.com/apache/hudi/tree/3b9a30528bd6a6369181702303f3384162b04a7f)
   
   * Spark version : 2.4.4
   
   * Hive version : N/A
   
   * Hadoop version : 2.7.3
   
   * Storage (HDFS/S3/GCS..) : ABFSS
   
   * Running on Docker? (yes/no) : yes
   
   
   **Additional context**
   
   It's running on Databricks.
   
   We build a fat jar that includes hudi-spark and pass it as the argument to spark-submit.
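   
   For completeness, a rough sketch of how we configure the writer (the DataFrame, field names, table name, and path below are placeholders; only the payload class option key is the one from this issue):
   
   ```
   import org.apache.spark.sql.{DataFrame, SaveMode}
   
   // Hypothetical writer configuration -- names and paths are made up.
   def writeWithCustomPayload(inputDf: DataFrame): Unit = {
     inputDf.write
       .format("org.apache.hudi")
       // ... plus the option selecting a MERGE_ON_READ table type
       .option("hoodie.table.name", "my_table")
       .option("hoodie.datasource.write.recordkey.field", "id")
       .option("hoodie.datasource.write.precombine.field", "ts")
       .option("hoodie.datasource.write.payload.class", "com.myCustompayloadClass")
       .mode(SaveMode.Append)
       .save("abfss://container@storageaccount.dfs.core.windows.net/tables/my_table")
   }
   ```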
   
   **Stacktrace**
   
   
   Driver Log:
   
   ```
   org.apache.hudi.exception.HoodieIOException: IOException when reading log file
        at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:245)
        at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:81)
        at org.apache.hudi.table.action.compact.HoodieMergeOnReadTableCompactor.compact(HoodieMergeOnReadTableCompactor.java:127)
        at org.apache.hudi.table.action.compact.HoodieMergeOnReadTableCompactor.lambda$compact$644ebad7$1(HoodieMergeOnReadTableCompactor.java:98)
        at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1043)
   ```
   
   Executor Log:
   
   ```
   20/07/15 13:11:13 ERROR AbstractHoodieLogRecordScanner: Got exception when reading log file
   org.apache.hudi.exception.HoodieException: Unable to instantiate payload class
           at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:78)
           at org.apache.hudi.common.util.SpillableMapUtils.convertToHoodieRecordPayload(SpillableMapUtils.java:116)
           at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processDataBlock(AbstractHoodieLogRecordScanner.java:277)
           at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:306)
           at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:239)
           at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:81)
           at org.apache.hudi.table.action.compact.HoodieMergeOnReadTableCompactor.compact(HoodieMergeOnReadTableCompactor.java:127)
           at org.apache.hudi.table.action.compact.HoodieMergeOnReadTableCompactor.lambda$compact$644ebad7$1(HoodieMergeOnReadTableCompactor.java:98)
           at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1043)
           at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
           at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
           at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
           at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221)
           at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1252)
           at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1226)
           at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1161)
           at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1226)
           at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1045)
           at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:364)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:315)
           at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:60)
           at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:353)
           at org.apache.spark.rdd.RDD.iterator(RDD.scala:317)
           at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
           at org.apache.spark.scheduler.Task.doRunTask(Task.scala:140)
           at org.apache.spark.scheduler.Task.run(Task.scala:113)
           at org.apache.spark.executor.Executor$TaskRunner$$anonfun$13.apply(Executor.scala:537)
           at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1541)
           at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:543)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.NoSuchMethodException: com.myCustompayloadClass.<init>(org.apache.hudi.common.util.Option)
           at java.lang.Class.getConstructor0(Class.java:3082)
           at java.lang.Class.getConstructor(Class.java:1825)
           at org.apache.hudi.common.util.ReflectionUtils.loadPayload(ReflectionUtils.java:76)
   ```

