eric9204 opened a new issue, #6966:
URL: https://github.com/apache/hudi/issues/6966

   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)?
   
   - Join the mailing list to engage in conversations and get faster support at [email protected].
   
   - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   When writing to Hudi from Spark, the error below occurs with the following parameters configured.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   ```
   hoodie.datasource.write.operation=insert
   hoodie.datasource.write.table.type=MERGE_ON_READ
   hoodie.datasource.write.precombine.field=ts
   hoodie.datasource.write.recordkey.field=id
   hoodie.datasource.write.partitionpath.field=part
   hoodie.table.name=ss_bucket_dsj_parquet_12
   hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.SimpleKeyGenerator
   hoodie.insert.shuffle.parallelism=8
   hoodie.datasource.compaction.async.enable=true
   hoodie.compact.inline.max.delta.commits=4
   hoodie.index.type=BUCKET
   hoodie.bucket.index.num.buckets=8
   hoodie.bucket.index.hash.field=id
   hoodie.storage.layout.partitioner.class=org.apache.hudi.table.action.commit.SparkBucketIndexPartitioner
   hoodie.storage.layout.type=BUCKET
   hoodie.metadata.enable=true
   hoodie.embed.timeline.server=false
   path=/tmp/hudi/ss_bucket_dsj_parquet_12
   checkpointLocation=/tmp/hudi/ckp
   hoodie.datasource.hive_sync.enable=true
   hoodie.datasource.hive_sync.username=ocdp
   hoodie.datasource.hive_sync.database=default
   hoodie.datasource.hive_sync.table=ss_bucket_dsj_parquet_12
   hoodie.datasource.hive_sync.password=ocdp
   hoodie.datasource.hive_sync.jdbcurl=jdbc:hive2://10.1.9.44:10000
   hoodie.datasource.hive_sync.partition_extractor_class=org.apache.hudi.hive.MultiPartKeysValueExtractor
   hoodie.datasource.hive_sync.partition_fields=part
   hoodie.datasource.write.hive_style_partitioning=true
   hoodie.datasource.hive_sync.bucket_sync=true
   
   hoodie.datasource.write.drop.partition.columns=true
   ```
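   
   For context, below is a minimal sketch of how these options might be wired into a Spark structured streaming write (the `checkpointLocation` option suggests a streaming job). The `rate` source and the `id`/`ts`/`part` schema are assumptions standing in for the real input, not the actual job:
   
   ```
   // Hedged reproduction sketch: the source and schema are hypothetical,
   // only the Hudi options mirror the configuration listed above.
   import org.apache.spark.sql.SparkSession
   
   object HudiBucketIndexRepro {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .appName("hudi-bucket-index-repro")
         .getOrCreate()
   
       // Hypothetical streaming source exposing the fields the configs reference.
       val df = spark.readStream
         .format("rate")
         .load()
         .selectExpr(
           "concat('id', cast(value as string)) as id",
           "timestamp as ts",
           "date_format(timestamp, 'yyyyMMddHHmm') as part")
   
       df.writeStream
         .format("hudi")
         .option("hoodie.table.name", "ss_bucket_dsj_parquet_12")
         .option("hoodie.datasource.write.operation", "insert")
         .option("hoodie.datasource.write.table.type", "MERGE_ON_READ")
         .option("hoodie.datasource.write.recordkey.field", "id")
         .option("hoodie.datasource.write.precombine.field", "ts")
         .option("hoodie.datasource.write.partitionpath.field", "part")
         .option("hoodie.index.type", "BUCKET")
         .option("hoodie.bucket.index.num.buckets", "8")
         .option("hoodie.bucket.index.hash.field", "id")
         .option("hoodie.datasource.write.drop.partition.columns", "true")
         .option("checkpointLocation", "/tmp/hudi/ckp")
         .start("/tmp/hudi/ss_bucket_dsj_parquet_12")
         .awaitTermination()
     }
   }
   ```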
   
   **Expected behavior**
   
   The Spark write to Hudi should complete successfully with the parameters above.
   
   **Environment Description**
   
   * Hudi version :
      Hudi-0.12.0
   * Spark version :
      Spark-3.1.1
   * Hive version :
      None
   * Hadoop version :
      Hadoop-3.3.0
   * Storage (HDFS/S3/GCS..) :
      HDFS
   * Running on Docker? (yes/no) :
      No
   
   **Additional context**
   
   
   
   **Stacktrace**
   
   ```
   22/10/17 11:04:08 ERROR HoodieWriteHandle: Error writing record HoodieRecord{key=HoodieKey { recordKey=id438436 partitionPath=part=202210171102}, currentLocation='null', newLocation='null'}
   java.io.EOFException
           at org.apache.avro.io.BinaryDecoder.ensureBounds(BinaryDecoder.java:473)
           at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:128)
           at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:423)
           at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:290)
           at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
           at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:267)
           at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:179)
           at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
           at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:232)
           at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:222)
           at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:175)
           at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:153)
           at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:145)
           at org.apache.hudi.avro.HoodieAvroUtils.bytesToAvro(HoodieAvroUtils.java:158)
           at org.apache.hudi.avro.HoodieAvroUtils.bytesToAvro(HoodieAvroUtils.java:148)
           at org.apache.hudi.common.model.OverwriteWithLatestAvroPayload.getInsertValue(OverwriteWithLatestAvroPayload.java:75)
           at org.apache.hudi.common.model.HoodieRecordPayload.getInsertValue(HoodieRecordPayload.java:105)
           at org.apache.hudi.execution.HoodieLazyInsertIterable$HoodieInsertValueGenResult.<init>(HoodieLazyInsertIterable.java:90)
           at org.apache.hudi.execution.HoodieLazyInsertIterable.lambda$getTransformFunction$0(HoodieLazyInsertIterable.java:103)
           at org.apache.hudi.common.util.queue.BoundedInMemoryQueue.insertRecord(BoundedInMemoryQueue.java:190)
           at org.apache.hudi.common.util.queue.IteratorBasedQueueProducer.produce(IteratorBasedQueueProducer.java:46)
           at org.apache.hudi.common.util.queue.BoundedInMemoryExecutor.lambda$null$0(BoundedInMemoryExecutor.java:106)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
           at java.util.concurrent.FutureTask.run(FutureTask.java:266)
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
           at java.lang.Thread.run(Thread.java:748)
   ```
   
   

