mahi4uin opened a new issue, #7122:
URL: https://github.com/apache/hudi/issues/7122

   **_Tips before filing an issue_**
   
   - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? Yes
   
   - Join the mailing list to engage in conversations and get faster support at 
[email protected].
   
   - If you have triaged this as a bug, then file an 
[issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.
   
   **Describe the problem you faced**
   
   Hi Team, we have a data pipeline that writes data into Hudi tables after 
reading from upstream sources. After a few days of successful runs, the job 
started failing while archiving the commit timeline.
   
   
   **To Reproduce**:
   
   We couldn't reproduce the error; it first occurred only once, but all 
subsequent runs have failed since then.
   
   
   **Expected behavior**
   
   Archiving the commit timeline should succeed, and subsequent writes should 
continue without failures.
   
   **Environment Description** : Apache Hudi on EMR cluster
   
   * Hudi version : 2.12-0.7.0-amzn-0.jar
   
   * Spark version : 3.1.1-amzn-0.1
   
   * Hive version :
   
   * Hadoop version : 
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : No
   
   
   
   **Stacktrace**
   
   
   
   
   22/10/29 11:05:34 INFO HoodieLogFormat$WriterBuilder: Computed the next log 
version for commits in s3://bucket/dqip_data/.hoodie/archived as 35 with 
write-token 1-0-1
   22/10/29 11:05:34 INFO HoodieLogFormat$WriterBuilder: HoodieLogFile on path 
s3://bucket/dqip_data/.hoodie/archived/.commits_.archive.35_1-0-1
   22/10/29 11:05:34 INFO HoodieTimelineArchiveLog: Archiving instants 
[[==>20221028122716__replacecommit__REQUESTED], 
[==>20221028122716__replacecommit__INFLIGHT], 
[20221028122716__replacecommit__COMPLETED], 
[==>20221028141111__replacecommit__REQUESTED], 
[==>20221028141111__replacecommit__INFLIGHT], 
[20221028141111__replacecommit__COMPLETED], 
[==>20221028141409__replacecommit__REQUESTED], 
[==>20221028141409__replacecommit__INFLIGHT], 
[20221028141409__replacecommit__COMPLETED], 
[==>20221028142716__replacecommit__REQUESTED], 
[==>20221028142716__replacecommit__INFLIGHT], 
[20221028142716__replacecommit__COMPLETED], 
[==>20221028161416__replacecommit__REQUESTED], 
[==>20221028161416__replacecommit__INFLIGHT], 
[20221028161416__replacecommit__COMPLETED], 
[==>20221028162846__replacecommit__REQUESTED], 
[==>20221028162846__replacecommit__INFLIGHT], 
[20221028162846__replacecommit__COMPLETED], 
[==>20221028165845__replacecommit__REQUESTED], 
[==>20221028165845__replacecommit__INFLIGHT], [2022102816
 5845__replacecommit__COMPLETED], 
[==>20221028181416__replacecommit__REQUESTED], 
[==>20221028181416__replacecommit__INFLIGHT], 
[20221028181416__replacecommit__COMPLETED], 
[==>20221028182709__replacecommit__REQUESTED], 
[==>20221028182709__replacecommit__INFLIGHT], 
[20221028182709__replacecommit__COMPLETED], 
[==>20221028201423__replacecommit__REQUESTED], 
[==>20221028201423__replacecommit__INFLIGHT], 
[20221028201423__replacecommit__COMPLETED], 
[==>20221028203806__replacecommit__REQUESTED], 
[==>20221028203806__replacecommit__INFLIGHT], 
[20221028203806__replacecommit__COMPLETED]]
   22/10/29 11:05:34 INFO HoodieTimelineArchiveLog: Wrapper schema 
{"type":"record","name":"HoodieArchivedMetaEntry","namespace":"org.apache.hudi.avro.model","fields":[{"name":"hoodieCommitMetadata","type":["null",{"type":"record","name":"HoodieCommitMetadata","fields":[{"name":"partitionToWriteStats","type":["null",{"type":"map","values":{"type":"array","items":{"type":"record","name":"HoodieWriteStat","fields":[{"name":"fileId","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"path","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"prevCommit","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"numWrites","type":["null","long"],"default":null},{"name":"numDeletes","type":["null","long"],"default":null},{"name":"numUpdateWrites","type":["null","long"],"default":null},{"name":"totalWriteBytes","type":["null","long"],"default":null},{"name":"totalWriteErrors","type":["null","lo
 
ng"],"default":null},{"name":"partitionPath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"totalLogRecords","type":["null","long"],"default":null},{"name":"totalLogFiles","type":["null","long"],"default":null},{"name":"totalUpdatedRecordsCompacted","type":["null","long"],"default":null},{"name":"numInserts","type":["null","long"],"default":null},{"name":"totalLogBlocks","type":["null","long"],"default":null},{"name":"totalCorruptLogBlock","type":["null","long"],"default":null},{"name":"totalRollbackBlocks","type":["null","long"],"default":null},{"name":"fileSizeInBytes","type":["null","long"],"default":null}]}},"avro.java.string":"String"}],"default":null},{"name":"extraMetadata","type":["null",{"type":"map","values":{"type":"string","avro.java.string":"String"},"avro.java.string":"String"}],"default":null},{"name":"version","type":["int","null"],"default":1},{"name":"operationType","type":["null",{"type":"string","avro.java.string":"String"}
 
],"default":null}]}],"default":null},{"name":"hoodieCleanMetadata","type":["null",{"type":"record","name":"HoodieCleanMetadata","fields":[{"name":"startCleanTime","type":{"type":"string","avro.java.string":"String"}},{"name":"timeTakenInMillis","type":"long"},{"name":"totalFilesDeleted","type":"int"},{"name":"earliestCommitToRetain","type":{"type":"string","avro.java.string":"String"}},{"name":"partitionMetadata","type":{"type":"map","values":{"type":"record","name":"HoodieCleanPartitionMetadata","fields":[{"name":"partitionPath","type":{"type":"string","avro.java.string":"String"}},{"name":"policy","type":{"type":"string","avro.java.string":"String"}},{"name":"deletePathPatterns","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}},{"name":"successDeleteFiles","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}},{"name":"failedDeleteFiles","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}}]},"avro.java.
 
string":"String"}},{"name":"version","type":["int","null"],"default":1},{"name":"bootstrapPartitionMetadata","type":["null",{"type":"map","values":"HoodieCleanPartitionMetadata","avro.java.string":"String","default":null}],"default":null}]}],"default":null},{"name":"hoodieCompactionMetadata","type":["null",{"type":"record","name":"HoodieCompactionMetadata","fields":[{"name":"partitionToCompactionWriteStats","type":["null",{"type":"map","values":{"type":"array","items":{"type":"record","name":"HoodieCompactionWriteStat","fields":[{"name":"partitionPath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"totalLogRecords","type":["null","long"],"default":null},{"name":"totalLogFiles","type":["null","long"],"default":null},{"name":"totalUpdatedRecordsCompacted","type":["null","long"],"default":null},{"name":"hoodieWriteStat","type":["null","HoodieWriteStat"],"default":null}]}},"avro.java.string":"String"}]}]}],"default":null},{"name":"hoodieRollbackMe
 
tadata","type":["null",{"type":"record","name":"HoodieRollbackMetadata","fields":[{"name":"startRollbackTime","type":{"type":"string","avro.java.string":"String"}},{"name":"timeTakenInMillis","type":"long"},{"name":"totalFilesDeleted","type":"int"},{"name":"commitsRollback","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}},{"name":"partitionMetadata","type":{"type":"map","values":{"type":"record","name":"HoodieRollbackPartitionMetadata","fields":[{"name":"partitionPath","type":{"type":"string","avro.java.string":"String"}},{"name":"successDeleteFiles","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}},{"name":"failedDeleteFiles","type":{"type":"array","items":{"type":"string","avro.java.string":"String"}}},{"name":"rollbackLogFiles","type":["null",{"type":"map","values":"long","avro.java.string":"String"}],"default":null},{"name":"writtenLogFiles","type":["null",{"type":"map","values":"long","avro.java.string":"String"}],"defa
 
ult":null}]},"avro.java.string":"String"}},{"name":"version","type":["int","null"],"default":1},{"name":"instantsRollback","type":{"type":"array","items":{"type":"record","name":"HoodieInstantInfo","fields":[{"name":"commitTime","type":{"type":"string","avro.java.string":"String"}},{"name":"action","type":{"type":"string","avro.java.string":"String"}}]},"default":null},"default":null}]}],"default":null},{"name":"hoodieSavePointMetadata","type":["null",{"type":"record","name":"HoodieSavepointMetadata","fields":[{"name":"savepointedBy","type":{"type":"string","avro.java.string":"String"}},{"name":"savepointedAt","type":"long"},{"name":"comments","type":{"type":"string","avro.java.string":"String"}},{"name":"partitionMetadata","type":{"type":"map","values":{"type":"record","name":"HoodieSavepointPartitionMetadata","fields":[{"name":"partitionPath","type":{"type":"string","avro.java.string":"String"}},{"name":"savepointDataFile","type":{"type":"array","items":{"type":"string","avro.java
 
.string":"String"}}}]},"avro.java.string":"String"}},{"name":"version","type":["int","null"],"default":1}]}],"default":null},{"name":"commitTime","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"actionType","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"version","type":["int","null"],"default":1},{"name":"hoodieCompactionPlan","type":["null",{"type":"record","name":"HoodieCompactionPlan","fields":[{"name":"operations","type":["null",{"type":"array","items":{"type":"record","name":"HoodieCompactionOperation","fields":[{"name":"baseInstantTime","type":["null",{"type":"string","avro.java.string":"String"}]},{"name":"deltaFilePaths","type":["null",{"type":"array","items":{"type":"string","avro.java.string":"String"}}],"default":null},{"name":"dataFilePath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"fileId","type":["null",{"type":"string","avro.java.string":"String"}]
 
},{"name":"partitionPath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"metrics","type":["null",{"type":"map","values":"double","avro.java.string":"String"}],"default":null},{"name":"bootstrapFilePath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null}]}}],"default":null},{"name":"extraMetadata","type":["null",{"type":"map","values":{"type":"string","avro.java.string":"String"},"avro.java.string":"String"}],"default":null},{"name":"version","type":["int","null"],"default":1}]}],"default":null},{"name":"hoodieCleanerPlan","type":["null",{"type":"record","name":"HoodieCleanerPlan","fields":[{"name":"earliestInstantToRetain","type":["null",{"type":"record","name":"HoodieActionInstant","fields":[{"name":"timestamp","type":{"type":"string","avro.java.string":"String"}},{"name":"action","type":{"type":"string","avro.java.string":"String"}},{"name":"state","type":{"type":"string","avro.java.string":"String"}}]}],"default":
 
null},{"name":"policy","type":{"type":"string","avro.java.string":"String"}},{"name":"filesToBeDeletedPerPartition","type":["null",{"type":"map","values":{"type":"array","items":{"type":"string","avro.java.string":"String"}},"avro.java.string":"String"}],"default":null},{"name":"version","type":["int","null"],"default":1},{"name":"filePathsToBeDeletedPerPartition","type":["null",{"type":"map","values":{"type":"array","items":{"type":"record","name":"HoodieCleanFileInfo","fields":[{"name":"filePath","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"isBootstrapBaseFile","type":["null","boolean"],"default":null}]}},"avro.java.string":"String"}],"doc":"This
 field replaces the field 
filesToBeDeletedPerPartition","default":null}]}],"default":null},{"name":"actionState","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"hoodieReplaceCommitMetadata","type":["null",{"type":"record","name":"HoodieReplaceCommitMetadata","
 
fields":[{"name":"partitionToWriteStats","type":["null",{"type":"map","values":{"type":"array","items":"HoodieWriteStat"},"avro.java.string":"String"}],"default":null},{"name":"extraMetadata","type":["null",{"type":"map","values":{"type":"string","avro.java.string":"String"},"avro.java.string":"String"}],"default":null},{"name":"version","type":["int","null"],"default":1},{"name":"operationType","type":["null",{"type":"string","avro.java.string":"String"}],"default":null},{"name":"partitionToReplaceFileIds","type":["null",{"type":"map","values":{"type":"array","items":{"type":"string","avro.java.string":"String"}},"avro.java.string":"String"}],"default":null}]}],"default":null}]}
   22/10/29 11:05:34 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   [the HadoopFileSystemOwner WARN line above repeats many times between 
11:05:35 and 11:05:37; repeats elided]
   22/10/29 11:05:37 INFO AbstractTableFileSystemView: Took 2568 ms to read  31 
instants, 36565 replaced file groups
   22/10/29 11:05:37 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   22/10/29 11:05:37 WARN ClusteringUtils: No content found in requested file 
for instant [==>20221029110511__replacecommit__INFLIGHT]
   22/10/29 11:05:37 INFO ClusteringUtils: Found 0 files in pending clustering 
operations
   22/10/29 11:05:37 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   22/10/29 11:05:38 WARN DAGScheduler: Broadcasting large task binary with 
size 2.2 MiB
   22/10/29 11:05:40 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   [the HadoopFileSystemOwner WARN line above repeats many times between 
11:05:40 and 11:05:43; repeats elided]
   22/10/29 11:05:43 INFO AbstractTableFileSystemView: Took 3338 ms to read  31 
instants, 36565 replaced file groups
   22/10/29 11:05:43 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   22/10/29 11:05:43 WARN ClusteringUtils: No content found in requested file 
for instant [==>20221029110511__replacecommit__INFLIGHT]
   22/10/29 11:05:43 INFO ClusteringUtils: Found 0 files in pending clustering 
operations
   22/10/29 11:05:43 WARN HadoopFileSystemOwner: found no group information for 
livy (auth:SIMPLE), using livy as primary group
   22/10/29 11:05:44 ERROR default: ****************An error occurred while 
calling o177.save.
   : org.apache.hudi.exception.HoodieCommitException: Failed to archive commits
           at 
org.apache.hudi.table.HoodieTimelineArchiveLog.archive(HoodieTimelineArchiveLog.java:322)
           at 
org.apache.hudi.table.HoodieTimelineArchiveLog.archiveIfRequired(HoodieTimelineArchiveLog.java:138)
           at 
org.apache.hudi.client.AbstractHoodieWriteClient.postCommit(AbstractHoodieWriteClient.java:426)
           at 
org.apache.hudi.client.AbstractHoodieWriteClient.commitStats(AbstractHoodieWriteClient.java:188)
           at 
org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:110)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:442)
           at 
org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:218)
           at 
org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:134)
           at 
org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
           at 
org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
           at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:185)
           at 
org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:223)
           at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at 
org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:220)
           at 
org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:181)
           at 
org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:134)
           at 
org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:133)
           at 
org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
           at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
           at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
           at 
org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:110)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:135)
           at 
org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
           at 
org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:135)
           at 
org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:253)
           at 
org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:134)
           at 
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
           at 
org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:68)
           at 
org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
           at 
org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
           at 
org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
           at 
org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
           at py4j.Gateway.invoke(Gateway.java:282)
           at 
py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
           at py4j.commands.CallCommand.execute(CallCommand.java:79)
           at py4j.GatewayConnection.run(GatewayConnection.java:238)
           at java.lang.Thread.run(Thread.java:750)
   Caused by: java.lang.IllegalArgumentException: Positive number of partitions 
required
           at 
org.apache.spark.rdd.ParallelCollectionRDD$.slice(ParallelCollectionRDD.scala:118)
           at 
org.apache.spark.rdd.ParallelCollectionRDD.getPartitions(ParallelCollectionRDD.scala:96)
           at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300)
           at scala.Option.getOrElse(Option.scala:189)
           at org.apache.spark.rdd.RDD.partitions(RDD.scala:296)
           at 
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:49)
           at org.apache.spark.rdd.RDD.$anonfun$partitions$2(RDD.scala:300)
           at scala.Option.getOrElse(Option.scala:189)
           at org.apache.spark.rdd.RDD.partitions(RDD.scala:296)
           at org.apache.spark.SparkContext.runJob(SparkContext.scala:2303)
           at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030)
           at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at 
org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
           at org.apache.spark.rdd.RDD.withScope(RDD.scala:414)
           at org.apache.spark.rdd.RDD.collect(RDD.scala:1029)
           at 
org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362)
           at 
org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361)
           at 
org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
           at 
org.apache.hudi.client.common.HoodieSparkEngineContext.map(HoodieSparkEngineContext.java:73)
           at 
org.apache.hudi.client.ReplaceArchivalHelper.deleteReplacedFileGroups(ReplaceArchivalHelper.java:72)
           at 
org.apache.hudi.table.HoodieTimelineArchiveLog.deleteReplacedFileGroups(HoodieTimelineArchiveLog.java:341)
           at 
org.apache.hudi.table.HoodieTimelineArchiveLog.archive(HoodieTimelineArchiveLog.java:303)
           ... 45 more
   ********************
   22/10/29 11:05:44 ERROR ApplicationMaster: User application exited with 
status 1
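   The root `IllegalArgumentException: Positive number of partitions required` 
is thrown by Spark's `ParallelCollectionRDD.slice`, which rejects a slice count 
below 1. A plausible trigger (an assumption from reading the stack trace, not 
confirmed against Hudi's source) is that `deleteReplacedFileGroups` derives its 
parallelism from the size of a collection that turns out to be empty, so 0 is 
passed to `parallelize`. A minimal pure-Python sketch of the precondition and 
the kind of guard that avoids it (`slice_partitions` and `safe_parallelism` are 
hypothetical illustration names, not Hudi APIs):
   
   ```python
   # Sketch only: mimics Spark's ParallelCollectionRDD.slice precondition,
   # which raises "Positive number of partitions required" when numSlices < 1.
   
   def slice_partitions(data, num_slices):
       """Split data into num_slices partitions, as parallelize() would."""
       if num_slices < 1:
           raise ValueError("Positive number of partitions required")
       # Simplified striped slicing; Spark uses contiguous ranges.
       return [data[i::num_slices] for i in range(num_slices)]
   
   def safe_parallelism(data, max_parallelism=100):
       """Guard: never request 0 slices, even for an empty input list."""
       return max(1, min(len(data), max_parallelism))
   
   # Deriving parallelism directly from len(data) on an empty list of
   # replaced file groups would give 0 and reproduce the error; clamping
   # to at least 1 (or skipping the Spark job entirely when the list is
   # empty) avoids it.
   replaced_file_groups = []
   partitions = slice_partitions(replaced_file_groups,
                                 safe_parallelism(replaced_file_groups))
   ```
   
   If this reading is right, the workaround on the user side is limited; the 
fix belongs in the archival code path (skip the delete step when there are no 
replaced file groups to remove).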
    
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
