lokeshj1703 opened a new pull request, #13098:
URL: https://github.com/apache/hudi/pull/13098

   ### Change Logs
   
   While downgrading the archived timeline, `EightToSevenDowngradeHandler` fails because it tries to read instant details from the active timeline instead of the archived timeline.
   ```
   25/04/07 15:26:47 WARN UpgradeDowngrade: Unable to upgrade or downgrade the metadata table to version SIX, ignoring the error and continue.
   org.apache.hudi.exception.HoodieException: Failed to downgrade LSM timeline to old archived format
       at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler.downgradeFromLSMTimeline(EightToSevenDowngradeHandler.java:232)
       at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler.downgrade(EightToSevenDowngradeHandler.java:131)
       at org.apache.hudi.table.upgrade.UpgradeDowngrade.downgrade(UpgradeDowngrade.java:275)
       at org.apache.hudi.table.upgrade.UpgradeDowngrade.run(UpgradeDowngrade.java:181)
       at org.apache.hudi.table.upgrade.UpgradeDowngrade.run(UpgradeDowngrade.java:150)
       at $line30.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$DowngradeTable$.downgradeTable(../src/main/scala/com/hudi/spark/DowngradeTable.scala:60)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:46)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:50)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:52)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:54)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:56)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:58)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:60)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:62)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:64)
       at $line31.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:66)
       at $line31.$read$$iw$$iw$$iw$$iw.<init>(<console>:68)
       at $line31.$read$$iw$$iw$$iw.<init>(<console>:70)
       at $line31.$read$$iw$$iw.<init>(<console>:72)
       at $line31.$read$$iw.<init>(<console>:74)
       at $line31.$read.<init>(<console>:76)
       at $line31.$read$.<init>(<console>:80)
       at $line31.$read$.<clinit>(<console>)
       at $line31.$eval$.$print$lzycompute(<console>:7)
       at $line31.$eval$.$print(<console>:6)
       at $line31.$eval.$print(<console>)
       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
       at java.base/java.lang.reflect.Method.invoke(Method.java:566)
       at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
       at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
       at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
       at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
       at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
       at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
       at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
       at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
       at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:865)
       at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:733)
       at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:435)
       at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:456)
       at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:239)
       at org.apache.spark.repl.Main$.doMain(Main.scala:78)
       at org.apache.spark.repl.Main$.main(Main.scala:58)
       at org.apache.spark.repl.Main.main(Main.scala)
       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
       at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
       at java.base/java.lang.reflect.Method.invoke(Method.java:566)
       at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
       at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
       at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
       at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
       at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
       at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: org.apache.hudi.exception.HoodieIOException: Could not read commit details from /tmp/output/20250407/table_comp_test_0_14_1_1_1_0-SNAPSHOT_1744019628/.hoodie/metadata/.hoodie/timeline/00000000000000000_20250407152558742.deltacommit
       at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.readDataStreamFromPath(ActiveTimelineV2.java:728)
       at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.getContentStream(ActiveTimelineV2.java:271)
       at org.apache.hudi.common.table.timeline.BaseHoodieTimeline.getInstantContentStream(BaseHoodieTimeline.java:558)
       at org.apache.hudi.common.table.timeline.HoodieTimeline.readInstantContent(HoodieTimeline.java:138)
       at org.apache.hudi.common.table.timeline.MetadataConversionUtils.getCommitMetadata(MetadataConversionUtils.java:342)
       at org.apache.hudi.common.table.timeline.MetadataConversionUtils.createMetaWrapper(MetadataConversionUtils.java:201)
       at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler$ArchiveEntryFlusher.accept(EightToSevenDowngradeHandler.java:262)
       at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler$ArchiveEntryFlusher.accept(EightToSevenDowngradeHandler.java:240)
       at org.apache.hudi.common.table.timeline.versioning.v2.ArchivedTimelineLoaderV2.lambda$loadInstants$1(ArchivedTimelineLoaderV2.java:73)
       at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
       at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
       at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
       at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
       at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
       at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
       at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
       at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
       at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
       at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
       at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
   Caused by: java.io.FileNotFoundException: File file:/tmp/output/20250407/table_comp_test_0_14_1_1_1_0-SNAPSHOT_1744019628/.hoodie/metadata/.hoodie/timeline/00000000000000000_20250407152558742.deltacommit does not exist
       at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:779)
       at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1100)
       at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:769)
       at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:462)
       at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:160)
       at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:372)
       at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:976)
       at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.open(HoodieWrapperFileSystem.java:481)
       at org.apache.hudi.storage.hadoop.HoodieHadoopStorage.open(HoodieHadoopStorage.java:149)
       at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.readDataStreamFromPath(ActiveTimelineV2.java:726)
       ... 19 more
   25/04/07 15:26:48 WARN EightToSevenDowngradeHandler: Failed to downgrade LSM timeline to old archived format
   org.apache.hudi.exception.HoodieException: Failed to downgrade LSM timeline to old archived format
     at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler.downgradeFromLSMTimeline(EightToSevenDowngradeHandler.java:232)
     at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler.downgrade(EightToSevenDowngradeHandler.java:131)
     at org.apache.hudi.table.upgrade.UpgradeDowngrade.downgrade(UpgradeDowngrade.java:275)
     at org.apache.hudi.table.upgrade.UpgradeDowngrade.run(UpgradeDowngrade.java:181)
     at DowngradeTable$.downgradeTable(../src/main/scala/com/hudi/spark/DowngradeTable.scala:60)
     ... 53 elided
   Caused by: org.apache.hudi.exception.HoodieIOException: Could not read commit details from /tmp/output/20250407/table_comp_test_0_14_1_1_1_0-SNAPSHOT_1744019628/.hoodie/timeline/20250407152428412_20250407152459189.commit
     at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.readDataStreamFromPath(ActiveTimelineV2.java:728)
     at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.getContentStream(ActiveTimelineV2.java:271)
     at org.apache.hudi.common.table.timeline.BaseHoodieTimeline.getInstantContentStream(BaseHoodieTimeline.java:558)
     at org.apache.hudi.common.table.timeline.HoodieTimeline.readInstantContent(HoodieTimeline.java:138)
     at org.apache.hudi.common.table.timeline.MetadataConversionUtils.getCommitMetadata(MetadataConversionUtils.java:342)
     at org.apache.hudi.common.table.timeline.MetadataConversionUtils.createMetaWrapper(MetadataConversionUtils.java:189)
     at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler$ArchiveEntryFlusher.accept(EightToSevenDowngradeHandler.java:262)
     at org.apache.hudi.table.upgrade.EightToSevenDowngradeHandler$ArchiveEntryFlusher.accept(EightToSevenDowngradeHandler.java:240)
     at org.apache.hudi.common.table.timeline.versioning.v2.ArchivedTimelineLoaderV2.lambda$loadInstants$1(ArchivedTimelineLoaderV2.java:73)
     at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
     at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
     at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
     at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
     at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
     at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
     at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
     at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
     at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
     at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
     at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
   Caused by: java.io.FileNotFoundException: File file:/tmp/output/20250407/table_comp_test_0_14_1_1_1_0-SNAPSHOT_1744019628/.hoodie/timeline/20250407152428412_20250407152459189.commit does not exist
     at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:779)
     at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:1100)
     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:769)
     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:462)
     at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:160)
     at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:372)
     at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:976)
     at org.apache.hudi.hadoop.fs.HoodieWrapperFileSystem.open(HoodieWrapperFileSystem.java:481)
     at org.apache.hudi.storage.hadoop.HoodieHadoopStorage.open(HoodieHadoopStorage.java:149)
     at org.apache.hudi.common.table.timeline.versioning.v2.ActiveTimelineV2.readDataStreamFromPath(ActiveTimelineV2.java:726)
     ... 19 more
   ```
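   The failure mode in the trace can be sketched in miniature. This is an illustrative sketch only; the class and method names below are hypothetical, not Hudi's actual API. The point is that a lookup which consults only the active timeline misses instants that have been archived, surfacing as "file does not exist", while a lookup that reads from the archived source first succeeds:

   ```java
   import java.util.HashMap;
   import java.util.Map;
   import java.util.Optional;

   // Hypothetical model: two stores of instant content, keyed by instant file name.
   class TimelineDowngradeSketch {
       final Map<String, byte[]> activeTimeline = new HashMap<>();
       final Map<String, byte[]> archivedTimeline = new HashMap<>();

       // Buggy lookup: only consults the active timeline, so instants that
       // have already been archived are reported as missing.
       Optional<byte[]> readInstantBuggy(String instant) {
           return Optional.ofNullable(activeTimeline.get(instant));
       }

       // Fixed lookup: a downgrade iterating archived entries should read the
       // content from the archive it is walking, falling back to the active
       // timeline only for instants that are still active.
       Optional<byte[]> readInstantFixed(String instant) {
           byte[] content = archivedTimeline.get(instant);
           if (content == null) {
               content = activeTimeline.get(instant);
           }
           return Optional.ofNullable(content);
       }
   }
   ```

   With an instant present only in the archived store, `readInstantBuggy` returns empty (the analogue of the `FileNotFoundException` above), while `readInstantFixed` resolves it.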
   
   ### Impact
   
   NA
   
   ### Risk level (write none, low medium or high below)
   
   low
   
   ### Documentation Update
   
   NA
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Change Logs and Impact were stated clearly
   - [ ] Adequate tests were added if applicable
   - [ ] CI passed
   

