wardlican opened a new issue, #3502:
URL: https://github.com/apache/amoro/issues/3502
### What happened?
The Flink-based optimizer executor failed with `java.lang.OutOfMemoryError: Java heap space` while reading a Parquet row group during a rewrite-files task; the executor then hit `IllegalStateException: Operator is stopped` when trying to report the task result:
```
2025-04-07 09:07:16,011 INFO org.apache.hadoop.io.compress.CodecPool [] - Got brand-new compressor [.zstd]
2025-04-07 09:07:15,995 ERROR org.apache.amoro.optimizer.common.OptimizerExecutor [] - Optimizer executor[1] executed task[OptimizingTaskId(processId:1743980154550, taskId:2032)] failed and cost 1828507
java.lang.RuntimeException: Run with ugi request failed.
    at org.apache.amoro.table.TableMetaStore.call(TableMetaStore.java:261) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:231) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_322]
    at javax.security.auth.Subject.doAs(Subject.java:360) ~[?:1.8.0_322]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) ~[optimizer-job.jar:?]
    at org.apache.amoro.table.TableMetaStore.doAs(TableMetaStore.java:231) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.io.AuthenticatedHadoopFileIO.doAs(AuthenticatedHadoopFileIO.java:201) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor.execute(AbstractRewriteFilesExecutor.java:110) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor.execute(AbstractRewriteFilesExecutor.java:66) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.executeTask(OptimizerExecutor.java:150) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkOptimizerExecutor.executeTask(FlinkOptimizerExecutor.java:70) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.start(OptimizerExecutor.java:53) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkExecutor.lambda$open$0(FlinkExecutor.java:64) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at java.lang.Thread.run(Thread.java:750) [?:1.8.0_322]
Caused by: java.lang.OutOfMemoryError: Java heap space
    at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:57) ~[?:1.8.0_322]
    at java.nio.ByteBuffer.allocate(ByteBuffer.java:335) ~[?:1.8.0_322]
    at org.apache.parquet.bytes.HeapByteBufferAllocator.allocate(HeapByteBufferAllocator.java:32) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.parquet.hadoop.ParquetFileReader$ConsecutivePartList.readAll(ParquetFileReader.java:1842) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.parquet.hadoop.ParquetFileReader.internalReadRowGroup(ParquetFileReader.java:990) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.parquet.hadoop.ParquetFileReader.readNextRowGroup(ParquetFileReader.java:940) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.parquet.ParquetReader$FileIterator.advance(ParquetReader.java:147) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.parquet.ParquetReader$FileIterator.next(ParquetReader.java:126) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.CloseableIterable$ConcatCloseableIterable$ConcatCloseableIterator.next(CloseableIterable.java:294) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.CloseableIterable$7$1.next(CloseableIterable.java:202) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.FilterIterator.advance(FilterIterator.java:65) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.FilterIterator.hasNext(FilterIterator.java:49) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.FilterIterator.advance(FilterIterator.java:64) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.FilterIterator.hasNext(FilterIterator.java:49) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.iceberg.io.CloseableIterable$7$1.hasNext(CloseableIterable.java:197) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor.rewriterDataFiles(AbstractRewriteFilesExecutor.java:181) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor$$Lambda$838/1395036290.call(Unknown Source) ~[?:?]
    at org.apache.amoro.table.TableMetaStore.call(TableMetaStore.java:256) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:231) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.table.TableMetaStore$$Lambda$839/2019725824.run(Unknown Source) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_322]
    at javax.security.auth.Subject.doAs(Subject.java:360) ~[?:1.8.0_322]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1876) ~[optimizer-job.jar:?]
    at org.apache.amoro.table.TableMetaStore.doAs(TableMetaStore.java:231) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.io.AuthenticatedHadoopFileIO.doAs(AuthenticatedHadoopFileIO.java:201) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor.execute(AbstractRewriteFilesExecutor.java:110) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizing.AbstractRewriteFilesExecutor.execute(AbstractRewriteFilesExecutor.java:66) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.executeTask(OptimizerExecutor.java:150) [blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkOptimizerExecutor.executeTask(FlinkOptimizerExecutor.java:70) [blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.start(OptimizerExecutor.java:53) [blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkExecutor.lambda$open$0(FlinkExecutor.java:64) [blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkExecutor$$Lambda$755/1780652282.run(Unknown Source) [blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
2025-04-07 09:07:16,088 ERROR org.apache.amoro.optimizer.common.OptimizerExecutor [] - Optimizer executor[1] got an unexpected error
java.lang.IllegalStateException: Operator is stopped
    at org.apache.amoro.optimizer.common.AbstractOptimizerOperator.callAuthenticatedAms(AbstractOptimizerOperator.java:118) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.completeTask(OptimizerExecutor.java:109) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.common.OptimizerExecutor.start(OptimizerExecutor.java:54) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at org.apache.amoro.optimizer.flink.FlinkExecutor.lambda$open$0(FlinkExecutor.java:64) ~[blob_p-92e0bdadd184d6267f2e48cf366e76896a7d009d-7c041a40000d780026896635e965ccd0:?]
    at java.lang.Thread.run(Thread.java:750) [?:1.8.0_322]
2025-04-07 09:07:16,093 INFO org.apache.amoro.shade.zookeeper3.org.apache.curator.framework.state.ConnectionStateManager [] - State change: SUSPENDED
```
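Not part of the original report, but since the root cause is a heap OOM in the TaskManager running the optimizer, a minimal diagnostic sketch may help. The two keys below are standard Flink configuration options; the memory value and dump path are placeholders, not recommendations:

```yaml
# flink-conf.yaml for the TaskManager running the Amoro optimizer job
# Capture a heap dump when the OOM recurs, for offline analysis (e.g. with Eclipse MAT):
env.java.opts.taskmanager: -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/optimizer-oom.hprof
# Give the executor more heap room (placeholder value; size to your largest row groups):
taskmanager.memory.process.size: 8g
```

The dump should show whether the allocation pressure really comes from `ParquetFileReader` buffering whole row groups, which would point at very large row groups in the files being rewritten rather than a leak.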
### Affects Versions
0.7.1
### What table formats are you seeing the problem on?
_No response_
### What engines are you seeing the problem on?
_No response_
### How to reproduce
_No response_
### Relevant log output
```shell
```
### Anything else
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's Code of Conduct
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]