[ https://issues.apache.org/jira/browse/HIVE-27241?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17733862#comment-17733862 ]

okumin commented on HIVE-27241:
-------------------------------

[~dharmikt] Which version and environment did you test with? I couldn't reproduce 
the issue using `4.0.0-alpha-2` or the latest master branch.

 
{code:java}
0: jdbc:hive2://hive-hiveserver2:10000/defaul> create table test_dt (id int) 
stored by iceberg stored as orc 
tblproperties('write.orc.compression-codec'='zstd');
...
No rows affected (0.717 seconds)
0: jdbc:hive2://hive-hiveserver2:10000/defaul> insert into test_dt values (1); 
...
----------------------------------------------------------------------------------------------
        VERTICES      MODE        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  
FAILED  KILLED  
----------------------------------------------------------------------------------------------
Map 1 .......... container     SUCCEEDED      1          1        0        0    
   0       0  
Reducer 2 ...... container     SUCCEEDED      1          1        0        0    
   0       0  
----------------------------------------------------------------------------------------------
VERTICES: 02/02  [==========================>>] 100%  ELAPSED TIME: 7.54 s     
----------------------------------------------------------------------------------------------
INFO  : Starting task [Stage-3:STATS] in serial mode
INFO  : Executing stats task
INFO  : Table default.test_dt stats: [numFiles=1, numRows=1, totalSize=246, 
rawDataSize=0, numFilesErasureCoded=0]
INFO  : Completed executing 
command(queryId=hive_20230618063054_cc11b0af-75ec-464e-a7c4-10a90622c047); Time 
taken: 11.334 seconds
1 row affected (11.52 seconds)
0: jdbc:hive2://hive-hiveserver2:10000/defaul> select * from test_dt;
...
+-------------+
| test_dt.id  |
+-------------+
| 1           |
+-------------+
1 row selected (0.465 seconds){code}
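For reference, the NoClassDefFoundError in the report points at io.airlift.compress.zstd.ZstdCompressor, which comes from the aircompressor library that ORC uses for zstd. One thing worth checking in the failing environment is whether that jar is actually on the Hive lib path; a rough sketch (the HIVE_HOME default below is an assumption, adjust for your deployment):

```shell
# Sketch: check whether a jar providing io.airlift.compress.zstd.ZstdCompressor
# (the aircompressor library used by ORC for zstd) is present under Hive's lib dir.
# The /opt/hive default is an assumption; point HIVE_HOME at your install.
HIVE_LIB="${HIVE_HOME:-/opt/hive}/lib"
if ls "$HIVE_LIB"/aircompressor-*.jar >/dev/null 2>&1; then
  msg="aircompressor jar found in $HIVE_LIB"
else
  msg="aircompressor jar not found in $HIVE_LIB"
fi
echo "$msg"
```

If the jar turns out to be missing from the task classpath, getting it onto the classpath (for example via Hive's `ADD JAR`) would be one thing to try, but knowing the exact version you hit this on would help first.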
 

> Insert queries fail for Iceberg tables stored as ORC with the zstd 
> compression codec.
> --------------------------------------------------------------------------------------
>
>                 Key: HIVE-27241
>                 URL: https://issues.apache.org/jira/browse/HIVE-27241
>             Project: Hive
>          Issue Type: Bug
>          Components: Iceberg integration
>            Reporter: Dharmik Thakkar
>            Priority: Major
>
> Insert queries fail for Iceberg tables stored as ORC with the zstd 
> compression codec.
> {code:java}
> create table test_dt (id int) stored by iceberg stored as orc 
> tblproperties('write.orc.compression-codec'='zstd');
> insert into test_dt values (1); {code}
> {code:java}
> Error while compiling statement: FAILED: Execution Error, return code 2 from 
> org.apache.hadoop.hive.ql.exec.tez.TezTask. Vertex failed, vertexName=Map 1, 
> vertexId=vertex_1681195782720_0001_2_00, diagnostics=[Task failed, 
> taskId=task_1681195782720_0001_2_00_000000, diagnostics=[TaskAttempt 0 
> failed, info=[Error: Error while running task ( failure ) : 
> attempt_1681195782720_0001_2_00_000000_0:java.lang.RuntimeException: 
> java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:351)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:280) at 
> org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:84)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:70)
>  at java.base/java.security.AccessController.doPrivileged(Native Method) at 
> java.base/javax.security.auth.Subject.doAs(Subject.java:423) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:70)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:40)
>  at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36) at 
> org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at 
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  at 
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  at java.base/java.lang.Thread.run(Thread.java:829) Caused by: 
> java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:297)
>  ... 15 more Caused by: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.createCodec(WriterImpl.java:281)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.OrcCodecPool.getCodec(OrcCodecPool.java:56)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:116)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:94)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.<init>(WriterImpl.java:220)
>  at 
> org.apache.hive.iceberg.org.apache.orc.OrcFile.createWriter(OrcFile.java:1010)
>  at 
> org.apache.iceberg.orc.OrcFileAppender.newOrcWriter(OrcFileAppender.java:171) 
> at org.apache.iceberg.orc.OrcFileAppender.<init>(OrcFileAppender.java:90) at 
> org.apache.iceberg.orc.ORC$WriteBuilder.build(ORC.java:210) at 
> org.apache.iceberg.orc.ORC$DataWriteBuilder.build(ORC.java:405) at 
> org.apache.iceberg.data.BaseFileWriterFactory.newDataWriter(BaseFileWriterFactory.java:136)
>  at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:49) 
> at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:33) 
> at 
> org.apache.iceberg.io.RollingFileWriter.openCurrentWriter(RollingFileWriter.java:104)
>  at org.apache.iceberg.io.RollingDataWriter.<init>(RollingDataWriter.java:44) 
> at 
> org.apache.iceberg.io.ClusteredDataWriter.newWriter(ClusteredDataWriter.java:51)
>  at org.apache.iceberg.io.ClusteredWriter.write(ClusteredWriter.java:87) at 
> org.apache.iceberg.io.ClusteredDataWriter.write(ClusteredDataWriter.java:32) 
> at 
> org.apache.iceberg.mr.hive.writer.HiveIcebergRecordWriter.write(HiveIcebergRecordWriter.java:53)
>  at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1168)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
>  at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116) at 
> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:174)
>  at 
> org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:154)
>  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:559) 
> at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
>  ... 18 more Caused by: java.lang.ClassNotFoundException: 
> io.airlift.compress.zstd.ZstdCompressor at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
>  at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522) ... 53 
> more ], TaskAttempt 1 failed, info=[Error: Error while running task ( failure 
> ) : attempt_1681195782720_0001_2_00_000000_1:java.lang.RuntimeException: 
> java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:351)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:280) at 
> org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:84)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:70)
>  at java.base/java.security.AccessController.doPrivileged(Native Method) at 
> java.base/javax.security.auth.Subject.doAs(Subject.java:423) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:70)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:40)
>  at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36) at 
> org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at 
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  at 
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  at java.base/java.lang.Thread.run(Thread.java:829) Caused by: 
> java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:297)
>  ... 15 more Caused by: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.createCodec(WriterImpl.java:281)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.OrcCodecPool.getCodec(OrcCodecPool.java:56)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:116)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:94)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.<init>(WriterImpl.java:220)
>  at 
> org.apache.hive.iceberg.org.apache.orc.OrcFile.createWriter(OrcFile.java:1010)
>  at 
> org.apache.iceberg.orc.OrcFileAppender.newOrcWriter(OrcFileAppender.java:171) 
> at org.apache.iceberg.orc.OrcFileAppender.<init>(OrcFileAppender.java:90) at 
> org.apache.iceberg.orc.ORC$WriteBuilder.build(ORC.java:210) at 
> org.apache.iceberg.orc.ORC$DataWriteBuilder.build(ORC.java:405) at 
> org.apache.iceberg.data.BaseFileWriterFactory.newDataWriter(BaseFileWriterFactory.java:136)
>  at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:49) 
> at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:33) 
> at 
> org.apache.iceberg.io.RollingFileWriter.openCurrentWriter(RollingFileWriter.java:104)
>  at org.apache.iceberg.io.RollingDataWriter.<init>(RollingDataWriter.java:44) 
> at 
> org.apache.iceberg.io.ClusteredDataWriter.newWriter(ClusteredDataWriter.java:51)
>  at org.apache.iceberg.io.ClusteredWriter.write(ClusteredWriter.java:87) at 
> org.apache.iceberg.io.ClusteredDataWriter.write(ClusteredDataWriter.java:32) 
> at 
> org.apache.iceberg.mr.hive.writer.HiveIcebergRecordWriter.write(HiveIcebergRecordWriter.java:53)
>  at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1168)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
>  at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116) at 
> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:174)
>  at 
> org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:154)
>  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:559) 
> at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
>  ... 18 more Caused by: java.lang.ClassNotFoundException: 
> io.airlift.compress.zstd.ZstdCompressor at 
> java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
>  at 
> java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522) ... 53 
> more ], TaskAttempt 2 failed, info=[Error: Error while running task ( failure 
> ) : java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:297)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:280) at 
> org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:84)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:70)
>  at java.base/java.security.AccessController.doPrivileged(Native Method) at 
> java.base/javax.security.auth.Subject.doAs(Subject.java:423) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:70)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:40)
>  at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36) at 
> org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at 
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  at 
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  at java.base/java.lang.Thread.run(Thread.java:829) Caused by: 
> java.lang.NoClassDefFoundError: io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.createCodec(WriterImpl.java:281)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.OrcCodecPool.getCodec(OrcCodecPool.java:56)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:116)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:94)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.<init>(WriterImpl.java:220)
>  at 
> org.apache.hive.iceberg.org.apache.orc.OrcFile.createWriter(OrcFile.java:1010)
>  at 
> org.apache.iceberg.orc.OrcFileAppender.newOrcWriter(OrcFileAppender.java:171) 
> at org.apache.iceberg.orc.OrcFileAppender.<init>(OrcFileAppender.java:90) at 
> org.apache.iceberg.orc.ORC$WriteBuilder.build(ORC.java:210) at 
> org.apache.iceberg.orc.ORC$DataWriteBuilder.build(ORC.java:405) at 
> org.apache.iceberg.data.BaseFileWriterFactory.newDataWriter(BaseFileWriterFactory.java:136)
>  at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:49) 
> at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:33) 
> at 
> org.apache.iceberg.io.RollingFileWriter.openCurrentWriter(RollingFileWriter.java:104)
>  at org.apache.iceberg.io.RollingDataWriter.<init>(RollingDataWriter.java:44) 
> at 
> org.apache.iceberg.io.ClusteredDataWriter.newWriter(ClusteredDataWriter.java:51)
>  at org.apache.iceberg.io.ClusteredWriter.write(ClusteredWriter.java:87) at 
> org.apache.iceberg.io.ClusteredDataWriter.write(ClusteredDataWriter.java:32) 
> at 
> org.apache.iceberg.mr.hive.writer.HiveIcebergRecordWriter.write(HiveIcebergRecordWriter.java:53)
>  at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1168)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
>  at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116) at 
> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:174)
>  at 
> org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:154)
>  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:559) 
> at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
>  ... 18 more , errorMessage=Cannot recover from this 
> error:java.lang.RuntimeException: java.lang.NoClassDefFoundError: 
> io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:101)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.pushRecord(MapRecordSource.java:76)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordProcessor.run(MapRecordProcessor.java:437)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:297)
>  at 
> org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:280) at 
> org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:84)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:70)
>  at java.base/java.security.AccessController.doPrivileged(Native Method) at 
> java.base/javax.security.auth.Subject.doAs(Subject.java:423) at 
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:70)
>  at 
> org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:40)
>  at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36) at 
> org.apache.hadoop.hive.llap.daemon.impl.StatsRecordingThreadPool$WrappedCallable.call(StatsRecordingThreadPool.java:118)
>  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at 
> java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  at 
> java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  at java.base/java.lang.Thread.run(Thread.java:829) Caused by: 
> java.lang.NoClassDefFoundError: io/airlift/compress/zstd/ZstdCompressor at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.createCodec(WriterImpl.java:281)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.OrcCodecPool.getCodec(OrcCodecPool.java:56)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:116)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.PhysicalFsWriter.<init>(PhysicalFsWriter.java:94)
>  at 
> org.apache.hive.iceberg.org.apache.orc.impl.WriterImpl.<init>(WriterImpl.java:220)
>  at 
> org.apache.hive.iceberg.org.apache.orc.OrcFile.createWriter(OrcFile.java:1010)
>  at 
> org.apache.iceberg.orc.OrcFileAppender.newOrcWriter(OrcFileAppender.java:171) 
> at org.apache.iceberg.orc.OrcFileAppender.<init>(OrcFileAppender.java:90) at 
> org.apache.iceberg.orc.ORC$WriteBuilder.build(ORC.java:210) at 
> org.apache.iceberg.orc.ORC$DataWriteBuilder.build(ORC.java:405) at 
> org.apache.iceberg.data.BaseFileWriterFactory.newDataWriter(BaseFileWriterFactory.java:136)
>  at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:49) 
> at 
> org.apache.iceberg.io.RollingDataWriter.newWriter(RollingDataWriter.java:33) 
> at 
> org.apache.iceberg.io.RollingFileWriter.openCurrentWriter(RollingFileWriter.java:104)
>  at org.apache.iceberg.io.RollingDataWriter.<init>(RollingDataWriter.java:44) 
> at 
> org.apache.iceberg.io.ClusteredDataWriter.newWriter(ClusteredDataWriter.java:51)
>  at org.apache.iceberg.io.ClusteredWriter.write(ClusteredWriter.java:87) at 
> org.apache.iceberg.io.ClusteredDataWriter.write(ClusteredDataWriter.java:32) 
> at 
> org.apache.iceberg.mr.hive.writer.HiveIcebergRecordWriter.write(HiveIcebergRecordWriter.java:53)
>  at 
> org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:1168)
>  at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.forwardUDTFOutput(UDTFOperator.java:133)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.UDTFCollector.collect(UDTFCollector.java:45)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.forward(GenericUDTF.java:110)
>  at 
> org.apache.hadoop.hive.ql.udf.generic.GenericUDTFInline.process(GenericUDTFInline.java:64)
>  at 
> org.apache.hadoop.hive.ql.exec.UDTFOperator.process(UDTFOperator.java:116) at 
> org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95) 
> at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:937) at 
> org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:174)
>  at 
> org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:154)
>  at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:559) 
> at 
> org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:92)
>  ... 18 more ]], Vertex did not succeed due to OWN_TASK_FAILURE, 
> failedTasks:1 killedTasks:0, Vertex vertex_1681195782720_0001_2_00 [Map 1] 
> killed/failed due to:OWN_TASK_FAILURE]Vertex killed, vertexName=Reducer 2, 
> vertexId=vertex_1681195782720_0001_2_01, diagnostics=[Vertex received Kill 
> while in RUNNING state., Vertex did not succeed due to OTHER_VERTEX_FAILURE, 
> failedTasks:0 killedTasks:1, Vertex vertex_1681195782720_0001_2_01 [Reducer 
> 2] killed/failed due to:OTHER_VERTEX_FAILURE]DAG did not succeed due to 
> VERTEX_FAILURE. failedVertices:1 killedVertices:1  {code}



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
