fengjian428 commented on issue #7654:
URL: https://github.com/apache/hudi/issues/7654#issuecomment-1397981689

   > @fengjian428 , sure.. Please feel free to include the test code. Thanks for the quick fix. I tried out a snapshot built from your branch clean_deadlock and did not come across deadlocks anymore with the FS-based lock provider.
   > 
   > However, I bumped up the writers to 100 and ran into the below. **I have not used FileSystemLockExpire.**
   > 
   > ```
   > 2023-01-19 02:04:40,411 [INFO  ] HoodieMergeHandle - Merging new data into oldPath /Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest/44/test/5dd8fea5-cf44-4432-b775-01cb67d1250d-0_0-0-0_20230119020209727.parquet, as newPath /Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest/44/test/5dd8fea5-cf44-4432-b775-01cb67d1250d-0_0-0-0_20230119020440202.parquet
   > 2023-01-19 02:04:40,412 [INFO  ] DirectWriteMarkers - Creating Marker Path=/Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest/.hoodie/.temp/20230119020440202/44/test/5dd8fea5-cf44-4432-b775-01cb67d1250d-0_0-0-0_20230119020440202.parquet.marker.MERGE
   > 
   > org.apache.hudi.exception.HoodieUpsertException: Failed upsert schema compatibility check
   > 
   >    at org.apache.hudi.table.HoodieTable.validateUpsertSchema(HoodieTable.java:820)
   >    at org.apache.hudi.client.HoodieJavaWriteClient.upsert(HoodieJavaWriteClient.java:109)
   >    at org.example.HudiOccTest.lambda$HudiTest$2(HudiOccTest.java:213)
   >    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
   >    at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
   >    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
   >    at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
   >    at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
   >    at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
   >    at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
   >    at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
   >    at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
   >    at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
   > Caused by: org.apache.hudi.exception.HoodieException: Failed to read schema/check compatibility for base path /Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest
   >    at org.apache.hudi.table.HoodieTable.validateSchema(HoodieTable.java:807)
   >    at org.apache.hudi.table.HoodieTable.validateUpsertSchema(HoodieTable.java:818)
   >    ... 12 more
   > Caused by: org.apache.hudi.exception.HoodieIOException: Could not read commit details from /Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest/.hoodie/20230119020417275.commit
   >    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.readDataFromPath(HoodieActiveTimeline.java:824)
   >    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.getInstantDetails(HoodieActiveTimeline.java:310)
   >    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.lambda$getCommitMetadataStream$2(HoodieActiveTimeline.java:349)
   >    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
   >    at java.base/java.util.stream.SortedOps$SizedRefSortingSink.end(SortedOps.java:361)
   >    at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:503)
   >    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:488)
   >    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
   >    at java.base/java.util.stream.FindOps$FindOp.evaluateSequential(FindOps.java:150)
   >    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   >    at java.base/java.util.stream.ReferencePipeline.findFirst(ReferencePipeline.java:543)
   >    at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.getLastCommitMetadataWithValidSchema(HoodieActiveTimeline.java:321)
   >    at org.apache.hudi.common.table.TableSchemaResolver.getLatestCommitMetadataWithValidSchema(TableSchemaResolver.java:491)
   >    at org.apache.hudi.common.table.TableSchemaResolver.getTableSchemaFromLatestCommitMetadata(TableSchemaResolver.java:225)
   >    at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchemaInternal(TableSchemaResolver.java:199)
   >    at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchema(TableSchemaResolver.java:139)
   >    at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchemaWithoutMetadataFields(TableSchemaResolver.java:192)
   >    at org.apache.hudi.table.HoodieTable.validateSchema(HoodieTable.java:804)
   >    ... 13 more
   > Caused by: java.io.FileNotFoundException: File file:/Users/xxxx/IdeaProjects/ApacheHudiOccTest/occ/tmp/hudiTest/.hoodie/20230119020417275.commit does not exist
   > ```
   Do you mean you increased numHudiWriteClients to 100?
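
   For context, the setup under discussion is Hudi's optimistic concurrency control with the filesystem-based lock provider. A minimal write-config sketch follows; the property names are as I recall them from Hudi's concurrency-control docs, and the lock path and expire keys in particular should be verified against your Hudi version (`expire` corresponds to the FileSystemLockExpire option mentioned above):

   ```properties
   # Enable OCC so multiple writers can commit to the same table
   hoodie.write.concurrency.mode=optimistic_concurrency_control
   # Multi-writer setups require lazy cleaning of failed writes
   hoodie.cleaner.policy.failed.writes=LAZY
   # Filesystem-based lock provider: no external ZK/Hive/DynamoDB service needed
   hoodie.write.lock.provider=org.apache.hudi.client.transaction.lock.FileSystemBasedLockProvider
   # Assumed keys (check your version's docs): lock file location and
   # stale-lock expiry in minutes
   hoodie.write.lock.filesystem.path=/tmp/hudiTest/.hoodie/.locks
   hoodie.write.lock.filesystem.expire=10
   ```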
   

